Featured

Research moves closer to brain-machine interface autonomy

Summary: Findings allow for the development of an autonomously updating brain-machine interface that can improve on its own by learning about its subject without additional programming. The system could help in developing new robotic prosthetics that perform more naturally. Source: University of Houston A University of Houston engineer is reporting in eNeuro that a brain-computer interface, a form of artificial intelligence, can sense when its user is expecting a reward by examining the interactions between single-neuron activities and the information flowing to these neurons, called the local field potential. Professor of biomedical engineering Joe Francis reports that his team’s findings allow for the development of an autonomously updating brain-computer interface (BCI) that improves...
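The study itself does not publish code, but the general idea of an "autonomously updating" decoder can be sketched as a linear mapping from neural activity to a movement command whose weights are nudged by an inferred reward signal. Everything below is an illustrative assumption, not the authors' method; the sizes, learning rate, and the placeholder reward are made up for the sketch.

```python
# Minimal sketch (not the eNeuro paper's method): a linear neural decoder that
# adjusts its own weights using an inferred reward signal, illustrating the
# idea of a BCI that improves without additional programming.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_outputs = 64, 2          # e.g., firing rates -> 2D cursor velocity
W = rng.normal(scale=0.01, size=(n_outputs, n_neurons))

def decode(rates):
    """Map a vector of single-neuron firing rates to a movement command."""
    return W @ rates

def update(rates, command, reward, lr=1e-3):
    """Reward-modulated update: reinforce input-output pairings that were
    followed by an (inferred) reward, weaken those that were not."""
    global W
    W += lr * (reward - 0.5) * np.outer(command, rates)

# Toy loop: in the study's setting the reward expectation would be inferred
# from spiking activity and the local field potential; here it is faked.
for _ in range(1000):
    rates = rng.poisson(5.0, size=n_neurons).astype(float)
    command = decode(rates)
    inferred_reward = float(rng.random() < 0.6)   # placeholder reward signal
    update(rates, command, inferred_reward)
```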

Six fingers per hand: Polydactyly equals extra motor ability

Summary: Polydactyly, a condition in which a person is born with an extra finger, offers significant benefits for motor skill and control. fMRI neuroimaging reveals that those with extra fingers are able to move the additional digits independently of the other fingers. The findings could help with the development of new prosthetics that extend motor abilities. Source: University of Freiburg Polydactyly is the extraordinary condition of someone being born with more than five fingers or toes. In a case study published in Nature Communications, researchers from the University of Freiburg, Imperial College London, the University Hospital of Lausanne, and EPFL have for the first time examined the motor skills and sensorimotor brain areas in people with polydactyly. The results show that an extra finger can signi...

Dog-like robot made by students jumps, flips and trots

Summary: Stanford Doggo, a student-designed dog-like robot, can perform a variety of acrobatic tricks and traverse challenging environments. The student inventors have posted construction plans and parts lists online for DIY scientists who wish to build their own version of the robopup. Source: Stanford Putting their own twist on robots that amble through complicated landscapes, the Stanford Student Robotics club’s Extreme Mobility team has developed a four-legged robot that is not only capable of performing acrobatic tricks and traversing challenging terrain but is also designed with reproducibility in mind. Anyone who wants their own version of the robot, dubbed Stanford Doggo, can consult comprehensive plans, code and a supply list that the students have made fr...

Children describe technology that gives them a sense of ambiguity as ‘creepy’

Summary: Children consider technologies that pose an ambiguous threat to be ‘creepy’. Researchers pinpoint five aspects of technology that contribute to this feeling of ambiguity, such as lack of control, ominous physical appearance, and mimicry. Source: University of Washington Many parents express concerns about privacy and online safety in technology designed for their children. But we know much less about what children themselves find concerning in emerging technologies. Now University of Washington researchers have defined for the first time what children mean when they say technology is “creepy.” Kids in a new study described creepy technology as something that is unpredictable or poses an ambiguous threat that might cause physical harm or threaten an important relationship....

New AI Sees Like a Human, Filling in the Blanks

Summary: A new deep learning system takes glimpses of its surroundings, representing less than 20% of a 360-degree view, and infers the rest of the environment. Source: UT Austin Computer scientists at The University of Texas at Austin have taught an artificial intelligence agent how to do something that usually only humans can do—take a few quick glimpses around and infer its whole environment, a skill necessary for developing effective search-and-rescue robots that could one day improve the effectiveness of dangerous missions. The team, led by professor Kristen Grauman, Ph.D. candidate Santhosh Ramakrishnan and former Ph.D. candidate Dinesh Jayaraman (now at the University of California, Berkeley), published their results today in the journal Science Robotics. Most AI agents—computer...
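The summary above describes the core task of observation completion: predict a full panorama from a few narrow glimpses. The toy sketch below is not the UT Austin model; the network, shapes, and random data are assumptions made only to show the input/output structure of the problem.

```python
# Toy sketch of glimpse-based scene completion (not the Science Robotics model):
# a small network reconstructs a full 360-degree "panorama" from a handful of
# narrow glimpses plus their positions. All sizes here are illustrative.
import torch
import torch.nn as nn

PANO_W, GLIMPSE_W, N_GLIMPSES = 100, 5, 4   # 4 glimpses ~ 20% of the view

class GlimpseCompleter(nn.Module):
    def __init__(self):
        super().__init__()
        # Encode each (glimpse, position) pair, pool over glimpses, decode panorama.
        self.encoder = nn.Sequential(nn.Linear(GLIMPSE_W + 1, 64), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(64, 128), nn.ReLU(),
                                     nn.Linear(128, PANO_W))

    def forward(self, glimpses, positions):
        # glimpses: (batch, N_GLIMPSES, GLIMPSE_W); positions: (batch, N_GLIMPSES, 1)
        feats = self.encoder(torch.cat([glimpses, positions], dim=-1))
        return self.decoder(feats.mean(dim=1))      # predicted full panorama

# One illustrative training step on random "panoramas".
model = GlimpseCompleter()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
pano = torch.rand(8, PANO_W)
starts = torch.randint(0, PANO_W - GLIMPSE_W, (8, N_GLIMPSES))
glimpses = torch.stack([torch.stack([pano[b, int(s):int(s) + GLIMPSE_W]
                                     for s in starts[b]]) for b in range(8)])
positions = starts.unsqueeze(-1).float() / PANO_W
opt.zero_grad()
loss = nn.functional.mse_loss(model(glimpses, positions), pano)
loss.backward()
opt.step()
```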

Room for thought: Brain region that watches for walls identified

Summary: MEG neuroimaging implicates the occipital place area (OPA) in our ability to rapidly sense our surroundings. The findings may help improve machine learning and robotics technologies aimed at mimicking visual processing in the human brain. Source: Zuckerman Institute To move through the world, you need a sense of your surroundings, especially of the constraints that restrict your movement: the walls, ceiling and other barriers that define the geometry of the navigable space around you. And now, a team of neuroscientists has identified an area of the human brain dedicated to perceiving this geometry. This brain region encodes the spatial constraints of a scene at lightning-fast speeds and likely contributes to our instant sense of our surroundings, orienting us in space so we c...

Kandao Introduces New AI Feature to Turn Regular Videos into Super Slow Motions

Written by AZoRobotics, Apr 12, 2019. Kandao, the leading 360/VR imaging & camera solution company, today introduced a new feature called ‘AI Slow-motion’ that can turn regular 30 fps videos into slow-motion footage of up to 300 fps using machine learning. The feature is first being applied to Kandao Obsidian and QooCam, the industry-leading 360 cameras, upgrading both into cameras that can create high-quality slow motion. “Slow-motion offers possibilities to capture special moments in an epic way, but not all cameras can do it. High-fps cameras are very expensive, as they require large amounts of memory and are data-intensive, which is even more challenging for 360 cameras with multiple lenses,” said Dan Chen, Kandao CEO. The sample video:...
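Kandao has not published its model, but the underlying task is frame-rate upsampling: synthesizing nine new frames between every pair of captured frames to go from 30 fps to 300 fps. The sketch below shows the simplest possible (non-learned) version, naive linear blending; an ML system like the one described would instead predict motion-aware intermediate frames.

```python
# Minimal sketch of frame-rate upsampling by naive linear blending between
# consecutive frames. Kandao's "AI Slow-motion" uses a learned model; this only
# illustrates the 30 fps -> 300 fps idea of inserting intermediate frames.
import numpy as np

def upsample(frames, factor=10):
    """frames: list of HxWx3 float arrays at 30 fps -> list at 30*factor fps."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        for i in range(factor):
            t = i / factor
            out.append((1 - t) * a + t * b)   # blend; learned methods warp by motion
    out.append(frames[-1])
    return out

clip = [np.random.rand(4, 4, 3) for _ in range(30)]   # one second at 30 fps
slow = upsample(clip)                                  # 291 frames at 300 fps
print(len(clip), "->", len(slow))
```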