As computational power increases and big data gets bigger, what we know as search engines will soon become actual decision engines that can anticipate what you want and help you make better decisions. This is a topic I discuss at length in What Happens if Data is Speech?
Today, search engines are primarily used to help people find information. Ultimately, the decision of what to do with that information is still made by the user, though informed by what the computer suggests. For some time, Google has made clear that its goal is to help people make better decisions about their lives and to serve as our “constant companion,” capable of offering “the ubiquitous presence of a personal assistant that never stops working.” Google co-founders Larry Page and Sergey Brin have asserted that “the goal is to insert a chip inside your head for the most effortless search engine imaginable.” This is not just science fiction. Google Glass is but the first incarnation of this technology.
Tim Wu referred to such a technology as a “concierge” that can not only find information but can “g[i]ve advice.” The concierge or adviser is similar to what I have elsewhere defined as “assisted decision making”: relying on intelligent algorithms that can comb through vast amounts of data to assist people in making more informed decisions. Both visions of the technology further blur the line between the computer and the user. As internet technologies evolve from helping users retrieve information to helping users make decisions, and the line between human and program blurs, concerns about constitutional scrutiny of data regulations become more potent.
The Times reports on a new wave of apps that are starting to make a dent in that area: apps that can anticipate what you want.
A range of start-ups and big companies like Google are working on what is known as predictive search — new tools that act as robotic personal assistants, anticipating what you need before you ask for it. Glance at your phone in the morning, for instance, and see an alert that you need to leave early for your next meeting because of traffic, even though you never told your phone you had a meeting, or where it was.
How does the phone know? Because an application has read your e-mail, scanned your calendar, tracked your location, parsed traffic patterns and figured out you need an extra half-hour to drive to the meeting.
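The logic the article describes, combining the calendar, your location, and a traffic estimate into an unprompted alert, can be sketched in a few lines. This is only a toy illustration with invented names and inputs; a real predictive-search app would pull these values from parsed e-mail, the calendar, GPS, and a traffic service.

```python
from datetime import datetime, timedelta

def suggest_departure(meeting_start, meeting_location,
                      baseline_drive, traffic_delay):
    """Combine calendar and traffic context into a departure alert.

    All parameters are hypothetical stand-ins for data a real app
    would gather on its own; nothing here is typed in by the user.
    """
    # The "query" is never entered: it is assembled from context.
    expected_drive = baseline_drive + traffic_delay
    leave_by = meeting_start - expected_drive
    return {
        "leave_by": leave_by,
        "alert": (f"Leave by {leave_by:%H:%M} for your meeting at "
                  f"{meeting_location} (traffic adds "
                  f"{int(traffic_delay.total_seconds() // 60)} min)."),
    }

# Example: a 10:00 meeting, a 30-minute drive, and 30 minutes of traffic,
# yielding the article's "leave an extra half-hour early" alert.
advice = suggest_departure(
    meeting_start=datetime(2013, 7, 29, 10, 0),
    meeting_location="downtown office",
    baseline_drive=timedelta(minutes=30),
    traffic_delay=timedelta(minutes=30),
)
print(advice["alert"])
```

The point of the sketch is the inversion the engineers describe below: the user supplies no search terms at all, and the inputs themselves play the role of the query.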
This technology actually learns about you from your context and understands what you want.
The technology is the latest development in Web search, and one of the first that is tailored to mobile devices. It does not even require people to enter a search query. Your context — location, time of day and digital activity — is the query, say the engineers who build these services.
“We can’t go on with eight meetings and 200 e-mails a day,” said N. Rao Machiraju, co-founder and chief executive of reQall, which sells its technology to other companies to make their own personal assistant apps. “We have a technology that isn’t waiting for you to ask it a question, but is anticipating what you need and when is the best time to deliver that.”
“By the time you search, something’s already failed,” said Phil Libin, chief executive of Evernote, a note-taking app that actively shows previous entries related to current circumstances.
Google even speaks in terms of an expert personal assistant.
“You can just imagine several years down the road, if that personal assistant was an expert in every field known to humankind,” said Amit Singhal, Google’s senior vice president for search.