With a search engine, you type in a keyword and try to find the best matches. It’s like walking into a library and being handed the ten best books about a topic. What we are trying to do with Wolfram|Alpha is to create custom reports that answer specific questions. We are computing answers – even if nobody has ever asked that question before, maybe we can work out a report that answers it. It takes human experts to do that, and that is something the search engine crowd is often skeptical about. They say that something is only good if it is based on a good algorithm and is infinitely scalable. But we are interested in encapsulating the world’s knowledge, not in scalability. Wikipedia is basically a container for random texts written by random people at random times. We can surely do better than that, especially if we want to build something that has different layers and relies on good information. The actual data that we have inside of Wolfram|Alpha is now roughly comparable to the textual content of the internet, and much of it comes from primary data sources that are not available online.