Today nearly everyone carries a smartphone with a good camera and powerful image processing tools. According to the latest reports, Microsoft has just launched a new tool for your smartphone to challenge Google Lens.

Microsoft Just Launched A New Tool For Your Phone To Challenge Google Lens

The fact that we all carry a smart device with a camera and powerful image processing tools pushes technology companies to look for new ways to take advantage of those capabilities, for example, by helping us interpret our surroundings.

This is what Google does with Google Lens. Now Microsoft has introduced Intelligent Visual Search, a new service with very similar functions. Microsoft's Intelligent Visual Search comes integrated into its own products: the Bing search engine, the Edge browser, and the Microsoft Launcher.

Microsoft announced the new functionality today on its official blog. The visual search is integrated into the Bing mobile apps and suggests stylistically similar images when it does not find an exact match among its results.

The application uses Microsoft's computer vision algorithms: machine learning models trained on large sets of previously labeled images until they can identify common features in new ones.
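The matching step behind a service like this can be illustrated with a toy sketch: images are reduced to feature vectors, and the catalog entry whose vector is closest to the query wins. Everything below (the catalog names, the three-dimensional vectors) is invented for illustration; real systems use deep-network embeddings with thousands of dimensions.

```python
import math

# Hypothetical toy feature vectors standing in for learned image embeddings.
CATALOG = {
    "red sneaker": [0.9, 0.1, 0.3],
    "blue sneaker": [0.2, 0.8, 0.3],
    "red handbag": [0.85, 0.15, 0.9],
}

def cosine_similarity(a, b):
    # Standard cosine similarity: dot product over the product of norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def most_similar(query, catalog):
    # Rank catalog images by similarity to the query vector and
    # return the best match, as a visual search backend might.
    return max(catalog, key=lambda name: cosine_similarity(query, catalog[name]))

print(most_similar([0.88, 0.12, 0.25], CATALOG))  # closest to "red sneaker"
```

This is also why "similar in style" results appear when there is no exact match: the nearest vectors are returned even when none is identical to the query.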

Sometimes there is nothing better than a photograph to explain an image. Thanks to artificial intelligence and cloud computing, according to Bing Images product manager Vince Leung, we will be able to "accelerate our ability to make sense of pixels".

The service works very intuitively. In Bing, for example, you have two options: you can either search for information by typing a word, or you can search by pointing the camera at an object and taking a photo.

The result of the second option is a set of similar images, some showing related examples and others the very object we are seeking more information about. The service seems quite accurate, although a badly lit or out-of-focus photo can end up delivering very different results.

If the tool has trouble finding the object, logo, or image, it asks you to narrow the search down to the most relevant area of the photograph.

So, what do you think about this? Simply share all your views and thoughts in the comment section below.

