A robot chef in every kitchen?

Advances in robotics, image processing and artificial intelligence are quickly opening the door to new areas of application for robotics. Moley Robotics has been developing the world’s first fully automated and intelligent cooking robot, which cooks recipes by mimicking the movements of a master chef and (more importantly) cleans up the kitchen when it is done.

As you can see in the video (also available on the Moley website), the robotic hands manipulate food, cooking tools, and other implements in much the same way a human chef would.


The first prototype was developed in collaboration with Shadow Robotics, Yachtline, DYSEGNO, Sebastian Conran and Stanford. The robot consists of a pair of articulated robotic hands that can reproduce the entire function of human hands with the same speed, sensitivity and movement.

According to Moley, “The cooking skills of Master Chef Tim Anderson, winner of the BBC Master Chef title were recorded on the system – every motion, nuance and flourish – then replayed as his exact movements through the robotic hands.”
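
Moley has not published the technical details of this pipeline, but the underlying record-and-replay idea can be sketched simply: sample the demonstrator’s joint angles at fixed timestamps during the demonstration, then play them back with interpolation. The class and method names below are illustrative, not Moley’s:

```python
from bisect import bisect_right

class TrajectoryRecorder:
    """Record timestamped joint-angle samples and replay them later.

    A toy sketch of motion capture and playback; a real system would
    add filtering, robot dynamics, and safety limits.
    """

    def __init__(self):
        self.samples = []  # list of (time_s, joint_angles)

    def record(self, time_s, joint_angles):
        self.samples.append((time_s, list(joint_angles)))

    def replay(self, time_s):
        """Return joint angles at time_s, linearly interpolated
        between the two nearest recorded samples."""
        times = [t for t, _ in self.samples]
        i = bisect_right(times, time_s)
        if i == 0:                       # before the first sample
            return list(self.samples[0][1])
        if i == len(self.samples):       # after the last sample
            return list(self.samples[-1][1])
        (t0, q0), (t1, q1) = self.samples[i - 1], self.samples[i]
        w = (time_s - t0) / (t1 - t0)
        return [a + w * (b - a) for a, b in zip(q0, q1)]

# A two-joint demonstration sampled at t = 0 and t = 1 second.
rec = TrajectoryRecorder()
rec.record(0.0, [0.0, 10.0])
rec.record(1.0, [90.0, 30.0])
print(rec.replay(0.5))  # → [45.0, 20.0], the midpoint of the motion
```

In practice, replaying exact joint trajectories only works if the kitchen is laid out exactly as it was during recording, which is precisely the adaptability limitation discussed below.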

It remains unclear how well the system adapts to changes in the position of ingredients, tools, and plates, but these are challenges that will become less and less serious as the technology evolves.

Image credits: Moley Robotics

Empathy with robots – science fiction or reality?

A number of popular videos made available by Boston Dynamics (a Google company) have shown different aspects of the potential of bipedal and quadrupedal robots to move around on rough terrain and to carry out complex tasks. The behavior of the robots is strangely natural, even though they are clearly man-made mechanical contraptions.


In a recent interview given at Disrupt SF, Boston Dynamics CEO Marc Raibert put the emphasis on making the robots friendlier. Being around a 250-pound robot that can move very fast may be very dangerous to humans, and the company is creating smaller and friendlier robots that can move around safely inside people’s houses.

This means that these robots can have many more applications beyond military ones. They may serve as butlers, servants or even as pets.

It is hard to predict what sort of emotional relationship these robots may eventually become able to create with their owners. Their animal-like behavior makes them almost likeable to us, despite their obviously mechanical appearance.

In some of these videos, humans intervene to make the jobs harder for the robots, kicking them, and moving things around in a way that looks frustrating to the robots. To many viewers, this may seem to amount to acts of actual robot cruelty, since the robots seem to become sad and frustrated. You can see some of those images around minute 3 of the video above, made available by TechCrunch and Boston Dynamics or in the (fake) commercial below.

Our idea that robots and machines don’t have feelings may be challenged in the near future, when human- or animal-like mechanical creatures become common. After all, extensive emotional attachment to Roomba robotic vacuum cleaners is nothing new!

Videos made available by TechCrunch and Boston Dynamics.

You can now hail an Uber self-driving car

If you are in Pittsburgh, you can now hail an Uber self-driving vehicle and see for yourself what the fuss is all about. In fact, you can even ask Siri to hail you an Uber car, which will come by itself and take you wherever you want. Or simply use the easy-to-use Uber app that has already changed the world of private transportation so much.

As you can see in the video, the car comes with a resident engineer. However, he or she is not normally involved in driving the vehicle, and is there mostly to reassure the customers and, possibly, to comply with existing regulations.

Although many people (about a third of Americans, the polls say) are still wary of using driverless vehicles, Uber took a step forward this week and made the technology available to anyone who wants to try it.

As The Verge reports, the technology still has a few quirks, but the self-driving systems of the Ford Fusion cars in use manage to address most of the challenges normally posed to a Pittsburgh driver.

Uber users in other cities will have to wait a little longer, as the system, extensively developed by CMU researchers, is certainly more mature in Pittsburgh than in other places around the world.

Video source: Uber

Microsoft HoloLens merges the real and the virtual worlds

The possibility of superimposing the real physical world and the virtual world created by computers has been viewed, for a long time, as a technology looking for a killer application.

The fact is that, until now, the technology was immature and the user experience less than perfect. Microsoft is trying to change that with its new product, Microsoft HoloLens. As of April this year, Microsoft is shipping the pre-production version of HoloLens for developers.

The basic idea is that, by using HoloLens, computer-generated objects can be superimposed on actual physical objects. Instead of using the “desktop” metaphor, users will be able to deploy applications in actual physical space. Non-holographic applications run as floating virtual screens that either stick to a specific point in physical space or move with the user. Holographic-enabled applications will let you use physical space for virtual objects as you would for physical ones. For instance, if you leave a report, say, on top of a desk, it will stay there until you pick it up.

The IEEE Spectrum report on the device, by Rod Furlan, provides some interesting additional information and gives the device a clear “thumbs up”.

The HoloLens, a self-contained computer weighing 580 grams, is powered by a 32-bit Intel Atom processor and Microsoft’s custom Holographic Processing Unit (HPU).

The following YouTube video, made available by Microsoft, gives some idea of what the product may become, once sufficiently powerful applications are developed.

Image and video credits: Microsoft HoloLens website.

The first beers designed by a machine learning algorithm

IntelligentX, a London-based brewing company, has started to design beers using a machine learning algorithm. The algorithm, called Automated Brewing Intelligence (ABI), collects customer feedback through a Facebook Messenger bot and uses this information to improve the quality of the beer.

The use of machine learning enables the company to adapt the recipe and the brewing method in order to improve the perceived quality of the beer. As of now, the company offers four basic beers: a classic British golden ale, a British bitter, a hoppy American pale ale and a smoky Marmite brew.
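
IntelligentX has not disclosed how ABI actually works, but a minimal sketch of feedback-driven recipe tuning, assuming an epsilon-greedy bandit over candidate hop levels (all names and numbers here are invented for illustration), might look like this:

```python
import random

class RecipeTuner:
    """Toy sketch of feedback-driven recipe tuning (not ABI's actual
    algorithm): an epsilon-greedy bandit over candidate hop levels."""

    def __init__(self, candidates, epsilon=0.1, seed=0):
        self.candidates = list(candidates)  # e.g. grams of hops per litre
        self.epsilon = epsilon
        self.rng = random.Random(seed)      # seeded for reproducibility
        self.totals = {c: 0.0 for c in self.candidates}
        self.counts = {c: 0 for c in self.candidates}

    def next_batch(self):
        """Choose the hop level to brew next: try each level once,
        then usually exploit the best-rated one, occasionally explore."""
        untried = [c for c in self.candidates if self.counts[c] == 0]
        if untried:
            return untried[0]
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.candidates)
        return max(self.candidates,
                   key=lambda c: self.totals[c] / self.counts[c])

    def rate(self, candidate, score):
        """Record a drinker's rating (e.g. 1-10 from a chatbot survey)."""
        self.totals[candidate] += score
        self.counts[candidate] += 1

tuner = RecipeTuner([4.0, 6.0, 8.0])
tuner.rate(4.0, 5)
tuner.rate(6.0, 9)
tuner.rate(8.0, 6)
print(tuner.next_batch())  # → 6.0, the best-rated level so far
```

The real system presumably tunes many parameters at once (malts, yeast, fermentation schedule), but the core loop — brew, collect ratings, adjust — is the same.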


As the company explains in a video available on their webpage, the algorithm uses the data gathered from users to adjust the many parameters involved in creating a beer, including its composition and brewing process.

So, being a master brewer is no longer a job safe from the challenges of AI. Which job will be the next to go? Sommelier? Winemaker? Chef?

Image and video credits: IntelligentX website.

Computers will always follow instructions. That may be the problem…

Many pessimistic scenarios about machines taking control of the world and harming humans are based on the idea that computers will eventually develop self-consciousness and define their own goals, incompatible with the goals of humanity. This is the premise of many science-fiction movies and books.

Many people believe, however, that this will not be the main problem. As reported in many news outlets, the University of California at Berkeley (my alma mater) has launched the Center for Human-Compatible Artificial Intelligence. The center will be headed by Stuart Russell, a famous expert in Artificial Intelligence (and co-author, with Peter Norvig, of the most widely used textbook in the field, Artificial Intelligence: A Modern Approach). Russell has been a vocal advocate for incorporating human values into the design of AI, in order to avoid the pitfalls that may come from AI systems running amok.

According to Stuart Russell, the issue is “that machines as we currently design them in fields like AI, robotics, control theory and operations research take the objectives that we humans give them very literally”. Therefore, they may pursue a task with an objective that is simply too literal. For instance, if instructed to solve the problem of global warming, a machine might decide that the most effective way is to wipe out the human race.
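
A toy example makes the point concrete. The actions and scores below are invented purely for illustration; the lesson is that the objective we literally stated and the objective we actually meant can rank plans very differently:

```python
# Toy illustration of the literal-objective problem. All actions and
# scores are invented; an optimizer simply maximizes what it is given.
actions = {
    "plant forests":      {"co2_reduced": 30,  "harm_to_humans": 0},
    "ban all industry":   {"co2_reduced": 80,  "harm_to_humans": 70},
    "eliminate humanity": {"co2_reduced": 100, "harm_to_humans": 100},
}

def literal_objective(effects):
    # What we literally asked for: maximize CO2 reduction.
    return effects["co2_reduced"]

def value_aligned_objective(effects):
    # What we meant: reduce CO2, while heavily penalizing harm.
    return effects["co2_reduced"] - 10 * effects["harm_to_humans"]

best_literal = max(actions, key=lambda a: literal_objective(actions[a]))
best_aligned = max(actions, key=lambda a: value_aligned_objective(actions[a]))
print(best_literal)   # → "eliminate humanity": the literal optimum
print(best_aligned)   # → "plant forests": what we actually wanted
```

The hard research problem, of course, is that real human values cannot be captured by a hand-written penalty term — which is exactly the gap the new center aims to address.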


According to the UC Berkeley press release, the center is being launched with a grant of $5.5 million from the Open Philanthropy Project, with additional grants from the Leverhulme Trust and the Future of Life Institute.

The center will work on mechanisms to guarantee that the AI systems of the future will act, by design, in a way that is aligned with human values. According to Stuart Russell, “AI systems must remain under human control, with suitable constraints on behavior, despite capabilities that may eventually exceed our own. This means we need cast-iron formal proofs, not just good intentions.”

Image credits: UC Berkeley. The image shows BRETT, the Berkeley Robot for the Elimination of Tedious Tasks, tying a knot after watching others demonstrate it.

Is there life out there?

As reported in an article in the journal Nature, Proxima Centauri, the star nearest to our sun, has an Earth-sized planet orbiting in the “Goldilocks” zone (not too hot, not too cold).

The recently discovered planet orbits its parent star every 11 days, in an orbit much smaller than that of the Earth. However, since Proxima Centauri is a red dwarf, it is much cooler than our sun, which puts this tight orbit at just the right distance. The planet, named Proxima Centauri b, has between 1.3 and 3 times the mass of the Earth, which makes it likely to be a rocky planet. Its distance from the star makes it possible that it harbors liquid water.
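
These figures can be sanity-checked with Kepler’s third law, a³ = GMT²/4π², assuming a stellar mass of roughly 0.12 solar masses for Proxima Centauri (an approximate figure used here for illustration):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
AU = 1.496e11     # astronomical unit, m

def orbital_radius_au(period_days, star_mass_solar):
    """Semi-major axis from Kepler's third law: a^3 = G*M*T^2 / (4*pi^2)."""
    T = period_days * 86400.0
    M = star_mass_solar * M_SUN
    a = (G * M * T**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
    return a / AU

# Proxima b: ~11-day period around a ~0.12 solar-mass red dwarf
# (the stellar mass is an assumed approximate value for this sketch).
print(orbital_radius_au(11.2, 0.12))  # prints roughly 0.05 AU
```

The result, about a twentieth of the Earth-sun distance, shows why such a short orbital period can still sit in the habitable zone of so faint a star.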


This combination of factors makes it the planet most likely to help us obtain additional information about the possible existence of life outside Earth. Ground-based instruments, such as those operated by the European Southern Observatory (ESO) in Chile’s Atacama Desert, will be able to obtain additional information.

ESO was involved in the discovery of Proxima Centauri b, and is likely to play an important role in the discovery of further information about this planet which, in astronomical terms, lies tantalisingly close to Earth, at “only” 4.2 light-years. Sending a spacecraft to that planet may also be a possibility, albeit a very challenging one.

The challenges involved in obtaining further information about this planet are significant, but not insurmountable, as the Economist reports. In a few years, we may have some better answers to Fermi’s famous question, “Where are they?”, referring to the possibility of extra-terrestrial life.