Get All Latest Research News done in the field of Physics, Chemistry, Medical Science, Electronics, Space, Environment , Nanotechnology, Computing and More

Showing posts with label Robotics. Show all posts

Tuesday, 5 November 2019

A Russian startup is selling humanoid robots whose appearance may be that of your choice

On the left, the humanoid robot "Robo-C" from Promobot. On the right, the chairman of the board of directors of the company, having served as a model. | Promobot

A Russian startup named Promobot, founded in 2015, has recently put up for sale one of its most successful creations: a realistic humanoid robot whose appearance can be customized to your choice. It is not quite the autonomous android you might imagine - it can only move its head and neck, and its torso is fixed - but it still marks a milestone in consumer robotics.

Please support us by visiting the ads in this post - your little click helps us keep posting content beneficial for general knowledge. Please leave a comment if you have any suggestions, and please follow us on Twitter.
Thank you 😊

"Everyone will now be able to order a robot with any appearance, for professional or personal use," Aleksei Iuzhakov, Chairman of Promobot's Board of Directors, said in a press release, going on to encourage his audience to "imagine a replica of Michael Jordan selling basketball uniforms, or William Shakespeare reading his own texts in a museum".

Promobot's Robo-C is not (yet?) able to walk, but its neck and torso each have three degrees of freedom of movement, according to the startup's website.

Its face has 18 moving parts, allowing the robot to produce 600 micro-expressions, and its artificial intelligence comprises 100,000 speech modules - potentially enough to give the creation realistic expressions. For the moment, however, judging by what we have seen, work remains to be done on the coherence between the moving parts and the rest of the face (see the video at the end of the article).

Digitization of the personality for a "digital immortality"

"The key moment in [Robo-C's] development is the digitization of personality and the creation of an individual appearance," Promobot co-founder Oleg Kivokurtsev told CNBC. "As a result, it would be a kind of digital immortality that we can offer to our customers."

The company also plans to manufacture robots for personal use, as a companion or personal assistant. For example, the robot could control smart home systems (in the same way as existing voice assistants). Credits: Promobot

According to CNBC, Promobot is already manufacturing four Robo-C units for its first customers.

One of the robots ordered will be placed in a government service center, where it will perform several functions (including passport scanning). Another will be a clone of Albert Einstein, for an exhibition on the theme of robotics.

The last two will be clones of the father and mother of a family from the Middle East, who want to use the androids to "greet the guests". A rather strange use case, this last one... And you, what do you think? What would you do with such a robot?

Video presentation of Android Robo-C:

Friday, 1 November 2019

MIT develops modular robots that can move, jump, recognize and coordinate

The rise of robotics in recent years has found many practical applications in everyday life. While individual robots can perform certain tasks well, swarms of robots are usually more efficient. However, achieving optimal communication and coordination among all the robots is a major challenge. In an attempt to address this, a team at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a surprisingly simple concept: self-assembling robotic cubes that can climb over one another, jump into the air, and roll across the ground.


Six years after the first iteration of the project, the robots can now communicate with each other using a barcode-like system on each side of the block, allowing the modules to identify one another. The autonomous fleet of 16 blocks can now perform simple tasks or behaviors, such as forming a line or a wall, or following arrows or a light source.

Inside each modular "M-Block" is a flywheel that spins at up to 20,000 rpm; when the flywheel is braked, its angular momentum is transferred to the cube, setting it in motion. Permanent magnets on every edge and face allow two cubes to attach to each other.
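As a rough back-of-the-envelope sketch of how a braked flywheel can pivot a cube, the calculation below estimates the stored angular momentum and the resulting torque impulse. Only the 20,000 rpm figure comes from the article; the flywheel mass, radius, and braking time are illustrative assumptions, not published specs.

```python
import math

# Hedged sketch: the flywheel mass, radius, and braking time below are
# assumed illustrative values, not MIT's actual specifications.
RPM = 20_000                      # flywheel speed quoted in the article
omega = RPM * 2 * math.pi / 60    # angular velocity in rad/s
m, r = 0.05, 0.02                 # assumed flywheel mass (kg) and radius (m)
I = 0.5 * m * r**2                # moment of inertia of a solid disc

L = I * omega                     # angular momentum stored in the flywheel

# When the flywheel is braked over a short interval, that momentum is
# transferred to the cube body as a torque impulse, pivoting it over an edge.
dt = 0.01                         # assumed braking time (s)
torque = L / dt                   # average torque on the cube (N*m)

print(f"omega = {omega:.0f} rad/s, L = {L:.4f} kg*m^2/s, torque ~ {torque:.2f} N*m")
```

Even a small, fast flywheel stores enough momentum to jolt a lightweight cube when braked abruptly, which is why the mechanism needs no external moving parts.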

Interconnected modular robot swarms: many potential applications

The team envisions powerful applications in inspection and, eventually, disaster response. Imagine a burning building in which a staircase has collapsed. In the future, one could simply throw M-Blocks on the ground and watch them build a temporary staircase to climb to the roof, or descend to the basement to rescue victims.

"'M' stands for movement, magnetism, and magic," says Daniela Rus, MIT professor and director of CSAIL. "Movement, because the cubes can move by jumping. Magnetism, because they can connect to one another using magnets and, once connected, move together and form structures. Magic, because no moving parts are visible and the cube seems to be magically driven."

Swarms of modular robots could be used in many areas: inspection and rescue, manufacturing, public health, construction, etc. Credits: Jason Dorfman / MIT CSAIL

While the mechanism inside is quite complex, the outside is by contrast much simpler, allowing more robust connections. Beyond inspection and rescue, the researchers also imagine using the blocks for tasks such as games, manufacturing, and healthcare.

"What's unique in our approach is that it's inexpensive, robust, and potentially easier to scale to millions of modules," says Romanishin. "M-Blocks can move in a general way. Other robotic systems have much more complicated motion mechanisms that require many steps, but our system is more scalable."

Movement driven by inertia
Previous modular robotic systems typically handle movement using modules with small robotic arms, called external actuators. These systems require a great deal of coordination even for the simplest movements, with several commands needed for a single jump.

In 2013, the team developed the mechanism behind the M-Blocks: cubes that move using so-called "inertial forces". Instead of using moving arms, each block contains a mass that it "throws" against the side of the module, causing the block to rotate and move.

Each module can move in four cardinal directions from any of its six faces, giving 24 different directions of movement. Without small arms and appendages protruding from the blocks, it is much easier for them to avoid damage and collisions.
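The 24-direction count quoted above is simply six resting faces times four pivot directions, which a one-line enumeration makes concrete:

```python
# Sketch of the movement count described above: a cube resting on any of
# its six faces can pivot in four cardinal directions, so 6 * 4 = 24 moves.
FACES = ["+x", "-x", "+y", "-y", "+z", "-z"]       # face the cube rests on
DIRECTIONS = ["north", "south", "east", "west"]    # pivot directions

moves = [(face, d) for face in FACES for d in DIRECTIONS]
print(len(moves))  # -> 24
```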

Optimized coordination via barcode communication

On the communication side, other attempts have used infrared light or radio waves, which can quickly become unwieldy: if many robots in a small area all try to send signals to one another, confusion quickly follows. Radio signals in particular can interfere with one another when many are transmitted within a small volume.

Romanishin developed algorithms designed to help the robots perform simple tasks, or "behaviors," which led to the idea of a barcode-like system whereby each robot can detect the identity of the blocks it is connected to.

In one experiment, the team instructed the modules to form a line from a random structure, checking whether they could determine the specific way they were connected to each other. If a block could not, it had to choose a direction and "roll" until it reached the end of the line.

Essentially, the blocks used the connection pattern - the way they were attached to each other - to guide their choice of movement, and 90% of the M-Blocks managed to form a line.
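The line-forming behavior can be illustrated with a toy model - our own simplification, not MIT's actual algorithm: any block stacked on top of the structure rolls along the bottom row until it reaches the open end and attaches there.

```python
# Toy sketch (not MIT's algorithm): blocks scattered on a 2D grid inspect
# their connection pattern; any block resting on top of another (y > 0)
# rolls along the structure until it lands at the end of the bottom row.
def form_line(blocks):
    """Collapse a 2D arrangement of blocks into a single bottom row."""
    blocks = set(blocks)
    while True:
        stacked = [(x, y) for (x, y) in blocks if y > 0]
        if not stacked:
            return sorted(blocks)          # all blocks now form a line
        x, y = stacked[0]
        blocks.remove((x, y))
        # "roll" along the bottom row to the first empty cell past its end
        nx = max(bx for bx, by in blocks if by == 0) + 1
        blocks.add((nx, 0))

# A 2x2 square of blocks flattens into a 4-block line.
print(form_line([(0, 0), (0, 1), (1, 0), (1, 1)]))
# -> [(0, 0), (1, 0), (2, 0), (3, 0)]
```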

This video from MIT introduces the recently developed modular robots:


Wednesday, 23 October 2019

Exoskeleton Improves Walking and Running Performance

Robotic clothing

A new "exosuit" - a soft robotic exoskeleton - can assist with both walking and running, providing significant energy savings in terms of metabolic activity.

Most importantly, the robotic clothing provides a net energy gain in both gaits.

"Robotic exoskeletons tend to be bulky and heavy; and while walking experiments have shown promising results, the energy spent running with the additional weight of the device outweighs the benefits of robotic assistance," said Professor Giuk Lee of Chung-Ang University in South Korea.


The problem is that walking and running have fundamentally different biomechanics, which makes it challenging to create devices that assist both types of gait.

To address this challenge, the Korean team took two approaches. The first was to make the suit mainly out of fabric, including the belt and the fastening wraps, which brought the weight of the entire device down to only 5 kg. The second was to create a mechanism that automatically switches the assistance mode between walking and running to achieve maximum efficiency.

An algorithm analyzes data collected by exoskeleton sensors and classifies gait (walking or running), providing feedback to the device and adjusting the assist mode. Treadmill and outdoor testing revealed that the algorithm was able to correctly identify gait more than 99.98% of the time.
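To illustrate what such a gait classifier might look like in its simplest form, here is a hedged sketch using a single made-up feature and threshold; the authors' actual algorithm and features are not detailed in the article.

```python
# Hedged sketch of a gait classifier like the one described above. The
# feature (peak vertical acceleration, in g) and the 2.0 g threshold are
# our own illustrative assumptions, not the authors' algorithm.
def classify_gait(accel_window):
    """Label a window of vertical-acceleration samples as walking or running."""
    # Running has a flight phase, so impact peaks are much larger than
    # during walking; a simple peak threshold separates the two gaits.
    return "running" if max(accel_window) > 2.0 else "walking"

print(classify_gait([0.9, 1.1, 1.3, 1.0]))   # gentle peaks -> "walking"
print(classify_gait([0.2, 2.8, 1.0, 0.4]))   # hard impact  -> "running"
```

In practice the suit's controller would run such a decision on each stride and switch the assistance profile accordingly.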

Wearable robot assistance

The exoskeleton's assistance reduced the energy cost of walking at a speed of 1.5 meters per second (5.4 km/h) by 9.3%, equivalent to losing 7.4 kg. Energy savings while running (2.5 meters per second, or 9 km/h) reached 4%, equivalent to a weight loss of 5.7 kg.

"Although the changes in metabolic rate are relatively modest, they are of a magnitude similar to those proven sufficient to improve maximal walking and running performance. We therefore believe these energy savings could translate into proportional increases in maximal performance - for example, on an outdoor race course," said Professor Lee.

The robotic suit is designed with people with mobility restrictions in mind, particularly those with limited knee function or above-knee amputees. But it can also be used to assist people without physical disabilities.

"We hope this wearable robot will have many uses, such as aiding rehabilitation training for elderly patients and improving the work efficiency of soldiers or firefighters. In the long run, we imagine this exosuit hanging in the closet alongside the clothes we wear every day," concludes Lee.


Article: Reducing the metabolic rate of walking and running with a versatile, portable exosuit
Authors: Jinsoo Kim, Giuk Lee, Roman Heimgartner, Dheepak Arumukhom Revi, Nikos Karavas, Ignacio Galiana, Wing Eckert-Erdheim, Patrick Murphy, David Perry, Nicolas Menard, Kim Choe Dabin, Philippe Malcolm, Conor J. Walsh
Journal: Science
Vol. 365, Issue 6454, pp. 668-672
DOI: 10.1126/science.aav7536

Sunday, 23 June 2019

The first non-invasively mind-controlled robotic arm

A team of researchers from Carnegie Mellon University, in collaboration with the University of Minnesota, has made a breakthrough in the field of noninvasive robotic device control. Using a noninvasive brain-computer interface (BCI), researchers have developed the first-ever successful mind-controlled robotic arm exhibiting the ability to continuously track and follow a computer cursor.

Being able to noninvasively control robotic devices using only thoughts will have broad applications, in particular benefiting the lives of paralyzed patients and those with movement disorders.

BCIs have been shown to achieve good performance for controlling robotic devices using only the signals sensed from brain implants. When robotic devices can be controlled with high precision, they can be used to complete a variety of daily tasks. Until now, however, BCIs successful in continuously controlling robotic arms have used invasive brain implants. These implants require a substantial amount of medical and surgical expertise to correctly install and operate, not to mention cost and potential risks to subjects. As such, their use has been limited to just a few clinical cases.

A grand challenge in BCI research is to develop less invasive or even totally noninvasive technology that would allow paralyzed patients to control their environment or robotic limbs using their own “thoughts.” Such noninvasive BCI technology, if successful, would bring such much-needed technology to numerous patients and even potentially to the general population.

However, BCIs that use noninvasive external sensing, rather than brain implants, receive “dirtier” signals, leading to lower resolution and less precise control. Thus, when using only the brain to control a robotic arm, a noninvasive BCI doesn’t stand up to using implanted devices. Despite this, BCI researchers have forged ahead, their eye on the prize of a less- or non-invasive technology that could help patients everywhere on a daily basis.

Bin He, department head and professor of biomedical engineering at Carnegie Mellon University, is achieving that goal, one key discovery at a time.

“There have been major advances in mind controlled robotic devices using brain implants. It’s excellent science,” says He. “But noninvasive is the ultimate goal. Advances in neural decoding and the practical utility of noninvasive robotic arm control will have major implications on the eventual development of noninvasive neurorobotics.”

Using novel sensing and machine learning techniques, He and his lab have been able to access signals deep within the brain, achieving a high resolution of control over a robotic arm. With noninvasive neuroimaging and a novel continuous pursuit paradigm, He is overcoming the noisy EEG signal, significantly improving EEG-based neural decoding and facilitating real-time continuous 2D robotic device control.

Using a noninvasive BCI to control a robotic arm that’s tracking a cursor on a computer screen, for the first time ever, He has shown in human subjects that a robotic arm can now follow the cursor continuously. Whereas robotic arms controlled by humans noninvasively had previously followed a moving cursor in jerky, discrete motions—as though the robotic arm was trying to “catch up” to the brain’s commands—now, the arm follows the cursor in a smooth, continuous path.
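A toy version of such continuous decoding can be sketched as a linear mapping from EEG features to 2D cursor velocity, fitted by least squares. Everything below - the feature choice, synthetic data, and weights - is an illustrative placeholder, not the decoder from the paper.

```python
import numpy as np

# Illustrative sketch only: a minimal linear decoder of the kind commonly
# used in noninvasive BCI work, mapping band-power-style features from EEG
# channels to a continuous 2D cursor velocity. All data here is synthetic.
rng = np.random.default_rng(0)

n_channels, n_samples = 8, 500
features = rng.standard_normal((n_samples, n_channels))   # stand-in EEG features

# Hidden linear mapping used only to synthesize noisy training targets.
true_W = rng.standard_normal((n_channels, 2))
velocity = features @ true_W + 0.1 * rng.standard_normal((n_samples, 2))

# Fit decoder weights by least squares, then decode a new feature vector
# into a 2D velocity command for the cursor (or robotic arm endpoint).
W, *_ = np.linalg.lstsq(features, velocity, rcond=None)
vx, vy = rng.standard_normal(n_channels) @ W
print(f"decoded cursor velocity: ({vx:.2f}, {vy:.2f})")
```

Running such a decoder on every incoming feature window is what produces a smooth, continuous trajectory rather than discrete jumps.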

In a paper published in Science Robotics, the team established a new framework that addresses and improves upon the “brain” and “computer” components of BCI by increasing user engagement and training, as well as spatial resolution of noninvasive neural data through EEG source imaging.

The paper, “Noninvasive neuroimaging enhances continuous neural tracking for robotic device control,” shows that the team’s unique approach to solving this problem not only enhanced BCI learning by nearly 60% for traditional center-out tasks, it also enhanced continuous tracking of a computer cursor by more than 500%.

"This work represents an important step in noninvasive brain-computer interfaces, a technology which someday may become a pervasive assistive technology aiding everyone, like smartphones."
Bin He, Department Head, Biomedical Engineering

The technology also has applications that could help a variety of people, by offering safe, noninvasive “mind control” of devices that can allow people to interact with and control their environments. The technology has, to date, been tested in 68 able-bodied human subjects (up to 10 sessions for each subject), including virtual device control and controlling of a robotic arm for continuous pursuit. The technology is directly applicable to patients, and the team plans to conduct clinical trials in the near future.

“Despite technical challenges using noninvasive signals, we are fully committed to bringing this safe and economic technology to people who can benefit from it,” says He. “This work represents an important step in noninvasive brain-computer interfaces, a technology which someday may become a pervasive assistive technology aiding everyone, like smartphones.”


Noninvasive neuroimaging enhances continuous neural tracking for robotic device control
B. J. Edelman, J. Meng, D. Suma, C. Zurn, E. Nagarajan, B. S. Baxter, C. C. Cline, and B. He
Science Robotics
Vol. 4, Issue 31, eaaw6844
DOI: 10.1126/scirobotics.aaw6844

Friday, 21 June 2019

Fish-inspired electric camera can see in total darkness

This is the elephant fish, which sees using a very sharp electric sense. [Image: Maik Dobiey / Uni Bonn]

Electric sense

Robots inspired by the African elephantnose fish (Gnathonemus petersii) will be able to see through the murky waters typically found in areas of environmental disaster, or wherever objects must be located in aquatic environments - whether because those objects are lost or because they require maintenance.

Instead of the optical images captured by traditional cameras, engineers and zoologists at the University of Bonn in Germany have designed a camera capable of generating "electric images," in which colors are detected as the electrical signatures of objects, just as the elephantnose fish does.

With an electric organ in its tail, the fish generates short electrical pulses up to 80 times per second. Electroreceptors on its skin - and especially on its trunk-like chin - measure how the pulses are modulated by the environment.

With this "electric sense," the fish can estimate distances, perceive shapes and materials, and even distinguish between living and dead objects. In fractions of a second, it uses the electric pulses to detect where mosquito larvae, its favorite prey, are hiding at the bottom of its habitat.

Electrical Camera

Martin Gottwald constructed a bionic camera by drawing on the two types of electroreceptors that the elephantnose fish uses in active electrolocation: one measures the intensity of the electrical signal, the other the waveform of the pulse.

Combining these two signals, Gottwald discovered that it is possible to produce "electric colors," analogous to the visual colors detected by the human eye, using electrical signals rather than visible light.
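The idea can be illustrated with a minimal sketch: if both receptor channels attenuate with distance in the same way, their ratio is a distance-invariant signature of the object, analogous to a color. The numbers below are invented for illustration.

```python
# Hedged sketch of the "electric color" idea described above: combining the
# two receptor readings (pulse amplitude and waveform distortion) into a
# ratio that stays constant with distance, so it characterizes the object's
# material rather than its position. All values are invented.
def electric_color(amplitude_mod, waveform_mod):
    """Distance-invariant signature built from the two sensor channels."""
    return amplitude_mod / waveform_mod

# The same object measured near and far: both modulations shrink with
# distance, but their ratio -- the "color" -- is preserved.
near = electric_color(amplitude_mod=0.80, waveform_mod=0.40)
far = electric_color(amplitude_mod=0.20, waveform_mod=0.10)
print(near, far)  # -> 2.0 2.0
```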

"With this bionic electric camera, it is possible to take 'electric images' of objects without any light, even in murky environments, which also allows analysis of the electrical and spatial properties of the objects captured," says Professor Gerhard von der Emde.

"Additional assessments have shown that electrical images can also be used to determine the 'electrical contours' of measured objects, which, like their optical contours, can provide information on shape and orientation," Gottwald said.

Electric field lines scattering around the camera and a plant stem are shown in bluish-white. They run from the transmitting electrodes at the rear to the measuring electrodes at the front and toward the center of the (gold) electric camera. [Image: Martin Gottwald / Hendrik Herzog]

Robots, drones and medical applications

Tests have shown that the camera-based system is capable of identifying various natural objects such as fish, plants or wood, as well as artificial objects such as aluminum or plastic spheres or rods.

Because they are not dependent on light, none of the parameters used by the camera is affected by the darkness or turbidity of the water.

In addition to allowing the development of inspection drones or robots to operate in hazy or misty environments, the team sees many other applications for electric cameras, including material control, device monitoring, and medical applications.


Article: A bio-inspired electric camera for short-range object inspection in murky waters
Authors: Martin Gottwald, Hendrik Herzog, Gerhard von der Emde
Journal: Bioinspiration & Biomimetics
DOI: 10.1088/1748-3190/ab08a6