When Written: June 2009
One of the great things about having the opportunity to write for a magazine like PC Pro is the chance to meet some of the movers and shakers of our industry, and sometimes to get a glimpse of the possible future of computing. Sometimes these glimpses come in the form of a sixty-foot, multi-colour PowerPoint presentation that only the most soporific could miss; others are mere hints at the way forward, and it is here that the fun lies in putting two and two together and hopefully getting four, although something more like 86.34567 is the usual result!
Infer.NET artificial intelligence framework: only download it if you have the brain the size of a small planet needed to understand it
A few days ago I was listening to a talk by John Guiver, who works in the machine learning group at Microsoft Research. He was demonstrating Infer.NET (http://research.microsoft.com/en-us/um/cambridge/projects/infernet), a new .NET framework for machine learning that helps programmers to develop artificial intelligence systems.
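To give a flavour of what programming with Infer.NET looks like, here is a minimal sketch based on its classic 'two coins' tutorial: you describe random variables, then ask an inference engine questions about them. I have written it against the namespace used by the later open-source release (Microsoft.ML.Probabilistic); the namespaces in the research download of the time were different, so treat the details as illustrative rather than definitive.

    using System;
    using Microsoft.ML.Probabilistic.Models;

    class TwoCoins
    {
        static void Main()
        {
            // Two fair coins, each modelled as a Bernoulli random variable.
            Variable<bool> firstCoin = Variable.Bernoulli(0.5);
            Variable<bool> secondCoin = Variable.Bernoulli(0.5);

            // A derived random variable: true only when both coins come up heads.
            Variable<bool> bothHeads = firstCoin & secondCoin;

            // The inference engine compiles the model and runs message passing.
            InferenceEngine engine = new InferenceEngine();
            Console.WriteLine("P(both heads) = " + engine.Infer(bothHeads));
            // Prints a Bernoulli distribution with probability 0.25.

            // Now condition on evidence: we are told the coins were NOT both heads.
            bothHeads.ObservedValue = false;
            Console.WriteLine("P(first coin heads | not both heads) = " + engine.Infer(firstCoin));
            // The belief about the first coin drops to roughly one third.
        }
    }

The appeal is that you state the model and the observations, and the framework works out the updated probabilities for you; the same pattern scales up to problems such as judging how trustworthy a set of search results is.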
One of the demonstrations showed how a system could be built that learnt to validate how accurate the results returned from a web search were. Perhaps this framework is destined to be plugged into Microsoft’s supposed ‘Google killer’, Bing? Just because a search engine returns different results from Google does not make it poorer (indeed, the new engine’s results may even be more valid), but Microsoft’s newly launched search engine could certainly do with some help.
As usual with such things, I made Bing my default search engine to see if I could live with it. Whilst writing this article I wanted to look up Infer.NET, but nothing useful came up, whereas Google returned a more relevant set of results. On further checking, as I could not believe that Bing was not indexing Microsoft’s own websites, I tried enclosing my search term in inverted commas. This produced the correct set of results, with Infer.NET at the top; you could say ‘Bingo!’ at this point, but perhaps not. Obviously the competition to build a better search engine has always been a keen one, even more so now that revenue from advertising on the web is such big business.
In a recent upgrade Google added ‘Search Options’, which, if you are anything like most people I have spoken to, you have probably missed: it sits behind the extra ‘Show options…’ link on the bar just below the Google logo. With this you can limit your searches to certain types of content, so you can search just forums or reviews, or return only the most up-to-date results. These are all very nice tweaks, but I can’t wait until we have a search engine that learns my preferences and filters out the sites I consider to be poor or that offer repeated or irrelevant content.
Perhaps you will be able to select different modes for searching, so that in, say, ‘work’ mode the search engine will not return any distracting videos or celebrity gossip. When coding, for example, I would love to be able to set my search preferences to ASP.NET version 2 or greater, so as to eliminate all the articles on .NET version 1, which are frankly of very little use now, and then to rank the results by date of submission, as the latest answer to a problem is probably the one I am looking for.
Someone researching a story, on the other hand, may want to find the original reference to a subject; you get the idea, I hope. What we need is more intelligence in our search engines, with them learning from their users. A lot of Google’s strength comes, I’m sure, from the large amounts of time spent by webmasters optimising their web pages so that Google ranks them highly, as well as from the companies involved paying for AdWords. I think we are a long way off, but when a search engine learns my preferences it will also be able to target advertising in a way never before achievable, and that, rather than anything to make your or my online lives any easier, would be the real incentive for such a development.
Microsoft’s search engine Bing is different, but is it better? Time will tell.
Article by: Mark Newton
Published in: PC Pro