Precise Answers From Google

I had a bit of a head-scratching experience just now: Google gave me a precise answer to a search that was more or less framed as a question; and I can’t recall ever seeing that before. Is this an old feature I’ve never stumbled on or is it something new that’s having a soft launch?

I usually don’t give Google a question, having learned instead to feed it a string of keywords tied in a Boolean knot. But today I asked “how many canals in amsterdam?” The first item in the results was an unequivocal answer — and not one buried in the context material — followed by the reference:


Not all “well-formed” questions evoke firm answers. And neither is it clear to me what a well-formed question requires. I proceeded with the time-honoured method of stabbing about, sometimes dignified as trial and error. I learned that there is some U.S. bias: “how high is the cn tower” failed, but “how high is the empire state building” succeeded.

I had success with “how big is the moon” and, in a segue that made sense at the time, “how big is iowa”. But oddness happened when I asked the next natural question, “how big is alberta”:


It seems that ‘alberta’ conjured up ‘AB’ which in turn evoked a Punjabi Olympic shooter. So I asked the same question about Saskatchewan, figuring that there’d be small chance of confusion here (after all, as the provincial government itself says, “Easy to draw, hard to spell”) — and found success.

I made a couple of feeble attempts to force Google to cough up something definitive in the legal realm, but Google wouldn’t play along. And so for legal workers this feature, if that’s what it is, may be little more than a curiosity. (I might suggest that if Google ever learns to recognize a legal question, it provide the same answer in every case — and they can use me as the authority — “It depends”.)

We’re told that Google is anxious about Wolfram|Alpha, and so this may be part of their attempt to become a knowledge machine as well. Whatever the case, it’s in an ill-formed state as yet, with no clear (to me) directions on which questions, framed in which way, will produce answers.


  1. This is very interesting Simon. I had a specific answer to the question “What is an anton pillar order”. It was the right answer too!

  2. A similar search technique that can be more productive than asking the question is to search for part of the answer or the answer with an asterisk in place of the information you seek. For example:

    “the CN tower is * tall”
    “beyond a reasonable doubt means”

  3. On the Google list at the top of the page, click “more”, then on the next page, click “even more”. This gives you some ways to ask what are essentially truncated questions. I suspect it was Google’s way of dealing with Ask Jeeves (now Ask.com).

  4. Oops, forgot to mention to scroll down to “web search features”.

  5. Indeed, David. But what interested me wasn’t so much the best way of getting an answer to my question as it was Google’s presentation of a definitive answer in a form that’s different from the usual spate of web site URLs.

    And Shaunna, I can’t duplicate your success. Whether I try your phrase with or without quotes, all I get is the usual set of results, headed up by a “Web Definitions” URL result, which is not quite the same thing as the declarative statements I’d been getting for questions with numerical answers, at least.

  6. My bad, Simon, I misread the screen – what came top of the list for my question was a web definition from Wikipedia. I was so excited to see a valid answer to the question that I didn’t look further to see the source. Rookie mistake; you would think it was a Monday.

  7. Google has been answering my math questions for a long time. It is a great calculator.

  8. Pretty sure this is a direct response to Wolfram Alpha, which got a lot of hype leading up to its release. Check out the full Wikipedia article on it if you’re interested, but the intro quote suffices to make my point:

    Wolfram|Alpha (also written as WolframAlpha and Wolfram Alpha) is an answer engine developed by Wolfram Research. It is an online service that answers factual queries directly by computing the answer from structured data, rather than providing a list of documents or web pages that might contain the answer as a search engine might.[3] It was announced in March 2009 by Stephen Wolfram, and was released to the public on May 15, 2009.