From September 2009 to August 2012, I used as my everyday web search engine an
app that merges results from several search engines. This page shows some stats
extracted from my tracked click data, in order to help answer the question that
kind of sparked the whole project, "which engine gives the best results?"

For some background see here:

Displaying only clicks recorded since Nov 12 2018.


Clickthrough rates

  With engines hidden and fair blurb selection:

Each line shows, for each engine, the ratio of the number of times one of its
results was clicked to the number of times the engine was invoked.

If an engine always returned off-topic results, I would never click on its
results, so its clickthrough rate would be 0. If an engine always returned
results so relevant that every result I ever clicked on had been returned by it
(possibly along with other engines), then that engine's clickthrough rate would
be 100%.

The number in square brackets is the average rank, within the engine's result
page, of the results that I clicked on. Smaller values mean I tend to click on
results that appear higher in that engine's results; higher values mean I tend
to dig deeper into them.
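To make the two numbers concrete, here is a minimal Python sketch of how they
could be computed. The data layout and all names are illustrative, not the
app's actual schema: I assume the app kept a per-engine invocation count and,
for each click, recorded which engines had returned the clicked result and at
what rank.

```python
from collections import defaultdict

# Hypothetical inputs: total invocations per engine, and one dict per click
# mapping each engine that returned the clicked result to the rank it gave it.
invocations = {"google": 1000, "bing": 980, "yandex": 700}
clicks = [
    {"google": 1, "bing": 3},   # this clicked result came from two engines
    {"google": 2},
    {"yandex": 5, "google": 1},
]

clicked = defaultdict(int)   # number of clicks credited to each engine
rank_sum = defaultdict(int)  # sum of the ranks of the clicked results

for click in clicks:
    for engine, rank in click.items():
        clicked[engine] += 1
        rank_sum[engine] += rank

for engine, n in invocations.items():
    ctr = clicked[engine] / n  # clickthrough rate for this engine
    avg_rank = rank_sum[engine] / clicked[engine] if clicked[engine] else None
    print(f"{engine}: {ctr:.1%} [{avg_rank}]")
```

Note that a single click can credit several engines at once, which is why the
rates across engines don't need to add up to 100%.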

As the heading suggests, these numbers exclude the times when I opted to see
which engine had contributed which results, since my opinion of the engines
might skew my choices (I rarely turn on the engine names, except when debugging
the app itself).

Some engines were invoked fewer times because after a while I stopped using
them.

User votes

Since 2011-05-20, every search result has two links next to it that allow the
user (that is, me) to flag results that stand out as significantly better or
worse than the rest of the lot.

Here are the tallies. The numbers next to each engine indicate the number of
upvotes ("+") and downvotes ("-") it has received. The percentage in brackets
is the average number of votes the engine receives per invocation:

An alternative way of scoring these votes is to give more weight to results
that are ranked higher by the engines. This seems intuitive: if for instance I
downvote a link that was returned by two engines, and the downvoted link
occupied the top position in engine A's results, but only the 30th position in
engine B's results, then it seems reasonable that more of the blame should go
to engine A than to engine B: returning an irrelevant link in 30th position is
not as bad as returning it in the top position.

In the table that follows, each value is the sum, for all votes on links
returned by that engine, of the inverse of the rank that the engine gave to the
link. So for instance in the previous example, the downvote would add a full
negative point to the tally for engine A, but only 1/30th of a point for
engine B.
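The inverse-rank tally described above can be sketched in a few lines of
Python. The vote encoding below (a sign plus a per-engine rank map) is my own
illustrative format, not the app's actual storage:

```python
from collections import defaultdict

# Each vote is (+1 or -1, {engine: rank the engine gave the voted link}).
votes = [
    (-1, {"A": 1, "B": 30}),  # the downvote from the example in the text
    (+1, {"A": 4}),
]

weighted = defaultdict(float)
for vote, ranks in votes:
    for engine, rank in ranks.items():
        # A full point at rank 1, 1/30th of a point at rank 30.
        weighted[engine] += vote / rank

# Engine "A" ends at -1/1 + 1/4 = -0.75; engine "B" at -1/30.
```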

Engine correlation matrix


The value in the cell at the intersection of row R and column C answers the
question "among all search results returned by search engine R that were ever
clicked, what percentage had also been returned by engine C?".

This means that if a cell has the value 100%, the results returned by the
engine in that row are a subset of the results returned by the engine in that
column (assuming that this holds independently of whether or not I clicked on
the results).
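The row/column overlap can be sketched as follows, assuming each clicked
result records the set of engines that returned it (sample data and names are
illustrative):

```python
# Each entry is the set of engines that returned one clicked result.
clicked_results = [
    {"google", "bing"},
    {"google"},
    {"bing", "yandex"},
    {"google", "yandex"},
]
engines = sorted(set().union(*clicked_results))

matrix = {}
for r in engines:
    r_hits = [s for s in clicked_results if r in s]  # clicked results from R
    for c in engines:
        overlap = sum(1 for s in r_hits if c in s)   # ...also returned by C
        matrix[(r, c)] = 100.0 * overlap / len(r_hits)

# The diagonal is always 100%, and the matrix is generally not symmetric.
```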


The results have been pretty consistent over time: Google emerges as the clear
winner no matter how you measure it. Their results are more relevant and
complete than any other engine's.

The engines complement each other quite well, however -- Bing often has
relevant results that Google doesn't have, for instance. And 1 out of 4 results
that I clicked didn't appear in Google at all.

Another interesting finding is that the quality of Yandex and Google results
keeps slowly but steadily improving, while other engines seem more stable.

Last recorded click: Fri Feb  1 11:30:39 2013 UTC