Wolfram|Alpha was released today. This is a fascinating piece of technology and I am trying to work out how I feel about it. If you don’t know what it is, for a general overview you could read “Wolfram ‘search engine’ goes live” from the BBC or a little more detail from “Ask Alpha: Quizzing the world’s first answer engine” from New Scientist. The technology enthusiast inside me is giddy with excitement but there is a little voice inside me crying caution.
It doesn’t do everything very well yet. For example, it knows “number of people in Nottingham” but not “number of bars in Nottingham” (it doesn’t know how to relate the unit “bars” to a city). But that’s not really the point; we should be interested in the potential here. I am particularly interested in how it handles maths, and in whether, when it fails to answer a question, that is because it never can or simply can’t yet.
I have been typing in some questions from a ‘fun’ maths quiz used at the University of Nottingham on open days. I shouldn’t list too many here (as they should remain useful!) but an interesting situation has occurred.
One question asks “What is the difference between six dozen dozens and half a dozen dozens?”
It’s a slightly silly question and I’m sure there are more mathematical examples, but I immediately wonder if it could be answered with no knowledge (or, importantly, without acquiring the knowledge) of what a “dozen” is.
I tweaked the question and got a correct response from “difference between half a dozen dozens and six dozen dozens.” I now know the answer is 792, and at no point have I found out (or been told) what a dozen is.
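For the record, the working the student never sees: six dozen dozens is 6 × 12 × 12 = 864, half a dozen dozens is 6 × 12 = 72, and the difference is 792. A quick sanity check in Mathematica (my own sketch, not anything Wolfram|Alpha shows you; the variable name is mine):

dozen = 12;
6 dozen dozen - (1/2) dozen dozen  (* 864 - 72 = 792 *)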
[An aside, on the subject of an unfinished product: strangely, “difference between six dozen dozens and half a dozen dozens” (the other way around) doesn’t work. More interestingly, “six dozen dozens minus half a dozen dozens” produces an erroneous result by implying a bracket in the wrong place: it interprets the input as “six dozen times (one dozen minus half a dozen dozens)”. It seems it needs to learn BIDMAS/BODMAS.]
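To make the mis-parse concrete (the figure here is my own inference from the bracketing it reports, not a value quoted from Wolfram|Alpha): six dozen is 72, and one dozen minus half a dozen dozens is 12 − 72 = −60, so the wrong bracketing gives 72 × (−60) = −4320 instead of 792. In Mathematica:

dozen = 12;
6 dozen dozen - (1/2) dozen dozen    (* intended reading: 792 *)
6 dozen (dozen - (1/2) dozen dozen)  (* the implied bracketing: 72 (12 - 72) = -4320 *)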
This example is merely illustrative. However, I wonder if there are situations where an answer (with “working”) can be obtained via Wolfram|Alpha by typing key phrases from a coursework question, and that answer is completely satisfactory to the assessment method (and marker) even though the student has at no point understood what is being asked or what is to be learned. Mind you, as Adam Partridge (AdamJTP) points out to me on Twitter, this is not too different from the many, many university students up and down the land who are currently cramming ‘knowledge’ into their heads which will remain there only for the few hours it takes to get through their exams.
The other caveat is that the same answer can be obtained through a Google search for “difference between half a dozen dozens and six dozen dozens” and the student is only slightly more likely to find out what the term means.
So perhaps it doesn’t matter. Google does stuff like this, we’ve had computer algebra for years, and Wolfram|Alpha doesn’t work all that well anyway. But, remembering that this is a first look at a new type of technology, it makes me uneasy.
Another question gives me another example: “The diameter of a circle of circumference 1 is…?” Wolfram|Alpha makes light work of “the diameter of a circle of circumference 1” (it even gives a nice little diagram), a question Google doesn’t cope well with. It is very easy to plug the text of this question into Wolfram|Alpha and get an answer, without the student ever developing an instinct for the properties of a circle. Another example you might give a student to tease out an understanding of the relationship between circumference and diameter is “the diameter of a circle of circumference 12 pi”.
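The relationship the question is probing is simply circumference = π × diameter, so both answers drop out of one division. A sketch in Mathematica (the function name is my own):

diameter[circumference_] := circumference/Pi
diameter[1]      (* 1/Pi, roughly 0.318 *)
diameter[12 Pi]  (* exactly 12 *)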
I am glad it doesn’t seem to know how to “list pairs of prime numbers which sum to 999”, a neat little trick I picked up from Math_Bits on Twitter and used successfully with students in York. I am using questions here that are quite basic because Wolfram|Alpha isn’t doing so well with more involved questions, though in many cases there’s no reason it shouldn’t be able to handle them in time. But the point I am trying to make is that sometimes we ask questions so that the student will learn something while thinking about the answer (and the actual answer is immaterial).
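The computation itself is trivial, which is rather the point. The insight the question is really after is that 999 is odd, so any pair of primes summing to it must include the only even prime, 2, leaving just 997 to test for primality. A brute-force check of my own in Mathematica (not something Wolfram|Alpha produced):

primes = Prime[Range[PrimePi[999]]];             (* all primes up to 999 *)
Select[Subsets[primes, {2}], Total[#] == 999 &]  (* returns {{2, 997}} *)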
In the same way that skill with mental arithmetic shortcuts (and the corresponding easy familiarity with numbers) has been largely lost in my generation through use of calculators, I worry about what this means for more advanced maths. Still, perhaps my unease is just a sign that I am getting old, and all this means is that questions which explore mathematical concepts need to be better crafted, which we (should) know anyway.
Of course this technology isn’t going to go away. It is a fascinating device for the betterment of humanity and such is progress. But it might force a change in the way certain concepts are taught/learned.
How many of your questions are answerable by Wolfram|Alpha with no need for understanding? Better still, give Wolfram|Alpha your assessment: how well does it score?
I like your article.
I personally would be very explicit about the question I was asking. Computational engines can tell you “what” something is, within their capabilities. “Why”, and the reasoning behind an answer, are not things they can supply.
As a student myself I use Mathematica to tell me “what” an answer is and to explore difficult problems. I know, however, that it won’t tell me why. I need to reason that for myself.
Consider reasoning about this:
f[n_] := (n*(1 + Sin[3*n]))/(n + 1)  (* 1 + Sin[3*n] oscillates between 0 and 2 *)
Limit[f[n], n -> Infinity]  (* although n/(n + 1) -> 1, the oscillation means no single limit exists *)
I might reason wrongly; having tried something, I can check it and correct my reasoning.
Also, I’d highlight that there isn’t always a need to know why or how. I know why and how my computer works, but many people do not, yet they can still browse the internet. Similarly, I have a basic understanding of my car, but the details are beyond me. In a society of increasing specialisation this is more and more the case. Whether that is an issue, and what standard of education we expect, is another matter.