A robot chef in every kitchen?

Advances in robotics, image processing, and artificial intelligence are quickly opening the door to new areas of application for robotics. Moley Robotics has been developing the world’s first fully automated, intelligent cooking robot, which cooks recipes by mimicking the movements of a master chef and (more importantly) clears the kitchen when it is done.

As you can see in the video (also available on the Moley website), the robotic hands manipulate food, cooking tools, and other implements in much the same way a human chef would.


The first prototype was developed in collaboration with Shadow Robotics, Yachtline, DYSEGNO, Sebastian Conran and Stanford. The robot consists of a pair of articulated robotic hands that can reproduce the entire function of human hands with the same speed, sensitivity and movement.

According to Moley, “The cooking skills of Master Chef Tim Anderson, winner of the BBC Master Chef title were recorded on the system – every motion, nuance and flourish – then replayed as his exact movements through the robotic hands.”
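This record-and-replay approach can be illustrated with a minimal sketch, assuming a hypothetical representation of the recorded data (Moley has not published how its system actually stores motions): capture timestamped joint positions during a demonstration, then interpolate between samples during playback.

```python
from bisect import bisect_right

# A recorded demonstration: (timestamp_seconds, joint_angles) samples.
# Joint angles are a tuple, one value per joint (hypothetical 3-joint hand).
recording = [
    (0.0, (0.0, 0.0, 0.0)),
    (1.0, (0.5, 0.2, 0.1)),
    (2.0, (1.0, 0.4, 0.3)),
]

def replay_pose(recording, t):
    """Return linearly interpolated joint angles at playback time t."""
    times = [ts for ts, _ in recording]
    if t <= times[0]:
        return recording[0][1]
    if t >= times[-1]:
        return recording[-1][1]
    i = bisect_right(times, t)  # first sample strictly after t
    (t0, p0), (t1, p1) = recording[i - 1], recording[i]
    alpha = (t - t0) / (t1 - t0)
    return tuple(a + alpha * (b - a) for a, b in zip(p0, p1))
```

A real system would record hundreds of samples per second and add force and tactile feedback on top, but the core idea — replaying a human demonstration rather than planning motions from scratch — is the same.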

It remains unclear how adaptable the system is to changes in the position of ingredients, tools, and plates, but these are challenges that will become less and less serious as the technology evolves.

Image credits: Moley Robotics

Empathy with robots – science fiction or reality?

A number of popular videos made available by Boston Dynamics (a Google company) have shown different aspects of the potential of bipedal and quadrupedal robots to move around on rough terrain and to carry out complex tasks. The behavior of the robots is strangely natural, even though they are clearly man-made mechanical contraptions.


In a recent interview at Disrupt SF, Boston Dynamics CEO Marc Raibert put the emphasis on making the robots friendlier. Being around a 250-pound robot that can move very fast may be dangerous to humans, so the company is creating smaller, friendlier robots that can move around safely inside people’s houses.

This means that these robots can have many more applications beyond military ones. They may serve as butlers, servants, or even pets.

It is hard to predict what sort of emotional relationship these robots may eventually be able to build with their owners. Their animal-like behavior makes them almost likeable to us, despite their obviously mechanical appearance.

In some of these videos, humans intervene to make the tasks harder for the robots, kicking them and moving things around in ways that look frustrating to the robots. To many viewers, this may seem to amount to actual robot cruelty, since the robots appear to become sad and frustrated. You can see some of those images around minute 3 of the video above, made available by TechCrunch and Boston Dynamics, or in the (fake) commercial below.

Our idea that robots and machines don’t have feelings may be challenged in the near future, when human- or animal-like mechanical creatures become common. After all, extensive emotional attachment to Roomba robotic vacuum cleaners is nothing new!

Videos made available by TechCrunch and Boston Dynamics.

You can now hail an Uber self-driving car

If you are in Pittsburgh, you can now hail an Uber self-driving vehicle and see for yourself what the fuss is all about. In fact, you can even ask Siri to hail you an Uber car, which will come by itself and take you wherever you want. Or simply use the easy-to-use Uber app that has already changed the world of private transportation so much.

As you can see in the video, the car comes with a resident engineer. However, he or she is not normally involved in driving the vehicle, and is there mostly to reassure customers and, possibly, to comply with existing regulations.

Although many people (about a third of Americans, according to polls) are still wary of using driverless vehicles, Uber took a step forward this week and made the technology available to anyone who wants to try it.

As The Verge reports, the technology still has a few quirks, but the self-driving systems of the Ford Fusion cars in use manage to handle most of the challenges normally posed to a Pittsburgh driver.

Uber users in other cities will have to wait a little longer, as the system, extensively developed by CMU researchers, is certainly more mature for use in Pittsburgh than in other places around the world.

Video source: Uber

Microsoft HoloLens merges the real and the virtual worlds

The possibility of superimposing the real physical world and the virtual world created by computers has long been viewed as a technology in search of a killer application.

The fact is that, until now, the technology was incipient and the user experience less than perfect. Microsoft is trying to change that with its new product, Microsoft HoloLens. As of April this year, Microsoft has been shipping the pre-production version of HoloLens to developers.

The basic idea is that, with HoloLens, computer-generated objects can be superimposed on actual physical objects. Instead of using the “desktop” metaphor, users will be able to deploy applications in actual physical space. Non-holographic applications run as floating virtual screens that either stick to a specific point in physical space or move with the user. Holographic-enabled applications will let you use physical space for virtual objects just as you would for physical objects. For instance, if you leave a report, say, on top of a desk, it will stay there until you pick it up.
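The two placement modes described above — pinned to a point in the room versus following the user — can be sketched with a tiny data model. All names here are invented for illustration; this is not the actual HoloLens API.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Point = Tuple[float, float, float]  # (x, y, z) in room coordinates

@dataclass
class AppWindow:
    name: str
    anchor: Optional[Point]  # a fixed spot in the room, or None to follow the user

def window_position(window: AppWindow, user_position: Point) -> Point:
    """World-anchored windows stay where you left them; unanchored ones track the user."""
    return window.anchor if window.anchor is not None else user_position

# A report "left on the desk" keeps its place; a browser floats along with you.
report = AppWindow("report", anchor=(2.0, 1.0, 0.5))
browser = AppWindow("browser", anchor=None)
```

The real device does much more work — mapping the room and keeping anchors stable as the user moves — but the user-facing distinction reduces to this choice of coordinate frame.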

The IEEE Spectrum report on the device, by Rod Furlan, provides some interesting additional information and gives the device a clear “thumbs up”.

The HoloLens, a self-contained computer weighing 580 grams, is powered by a 32-bit Intel Atom processor and Microsoft’s custom Holographic Processing Unit (HPU).

The following YouTube video, made available by Microsoft, gives some idea of what the product may become, once sufficiently powerful applications are developed.

Image and video credits: Microsoft HoloLens website.

The first beers designed by a machine learning algorithm

IntelligentX, a London-based brewing company, has started to design beers using a machine learning algorithm. The algorithm, called Automated Brewing Intelligence (ABI), collects customer feedback through a Facebook Messenger bot and uses this information to improve the quality of the beer.

The use of machine learning enables the company to adapt the recipe and the brewing method in order to improve the perceived quality of the beer. As of now, the company offers four basic beers: a classic British golden ale, a British bitter, a hoppy American pale ale, and a smoky Marmite brew.


As the company explains in a video available on its webpage, the algorithm uses the data gathered from users to adjust the many parameters involved in creating a beer, including its composition and brewing process.
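The feedback loop described — gather ratings, nudge recipe parameters, brew again — can be sketched as a simple hill-climbing search. This is purely a hypothetical illustration: IntelligentX has not published how ABI actually works, and both the parameters and the rating function below are invented.

```python
import random

random.seed(42)  # make the illustration reproducible

# Hypothetical recipe parameters the algorithm might tune (arbitrary units).
recipe = {"hops_g_per_l": 4.0, "malt_kg_per_l": 0.25, "ferment_days": 14.0}

def customer_rating(recipe):
    """Stand-in for real Messenger feedback: pretend drinkers prefer hops near 6 g/L."""
    return 10.0 - abs(recipe["hops_g_per_l"] - 6.0)

def improve(recipe, trials=50, step=0.5):
    """Randomly perturb one parameter at a time, keeping changes rated higher."""
    best, best_score = dict(recipe), customer_rating(recipe)
    for _ in range(trials):
        candidate = dict(best)
        key = random.choice(list(candidate))
        candidate[key] += random.uniform(-step, step)
        score = customer_rating(candidate)
        if score > best_score:  # keep only changes that customers rate higher
            best, best_score = candidate, score
    return best, best_score
```

A production system would of course batch changes into actual brews and collect ratings from real customers between iterations, but the optimisation principle — adjust parameters in the direction feedback rewards — is the same.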

So, being a master brewer is no longer a job safe from the challenges of AI. Which job will be the next to go? Sommelier? Winemaker? Chef?

Image and video credits: IntelligentX website.