Get all the latest research news in the fields of Physics, Chemistry, Medical Science, Electronics, Space, Environment, Nanotechnology, Computing and More


Tuesday, 5 November 2019

A Russian startup is selling humanoid robots that can be made to look like anyone you choose

On the left, Promobot's humanoid robot "Robo-C". On the right, the company's chairman of the board of directors, who served as its model. | Promobot

A Russian startup named Promobot, founded in 2015, has recently put up for sale one of its most successful creations: a realistic humanoid robot whose appearance can be customized to look like anyone you choose. It is not quite the autonomous android you might imagine - it can only move its head and neck and has a fixed torso - but the company has still reached a milestone in consumer robotics.


" Everyone will now be able to order a robot with any appearance, for professional or personal use, " Aleksei Iuzhakov, Chairman of Promobot's Board of Directors , said in a press release, encouraging a short time. after the people present to "imagine a replica of Michael Jordan selling basketball uniforms, or William Shakespeare reading his own texts in a museum".


Promobot's Robo-C is not (yet?) able to walk, but its neck and torso each have three degrees of freedom of movement, according to the startup's website.

Its face has 18 moving parts, which allow the robot to produce 600 micro-expressions, and its artificial intelligence draws on 100,000 speech modules - enough, potentially, to give the creation realistic expressions. For the moment, however, judging by what we have seen, there is still work to be done on coordinating the moving parts with the rest of the face (see the video at the end of the article).

Digitization of the personality for a "digital immortality"

" The key moment in [Robo-C's] development is the digitization of the personality and the creation of an individual appearance ," Promobot co-founder Oleg Kivokurtsev told CNBC. " As a result, it would be a kind of digital immortality that we can offer to our customers."

The company also plans to manufacture robots for personal use, as companions or personal assistants. For example, the robot could control smart-home systems, much like existing voice assistants. Credits: Promobot

According to CNBC, Promobot is already manufacturing four Robo-C units for its first customers.

One of the robots ordered will be placed in a government service center, where it will perform several functions (including passport scanning). Another will be a clone of Albert Einstein, for an exhibition on the theme of robotics.


The last two will be robotic clones of the father and mother of a family from the Middle East, who wants to use the androids to "greet the guests" - a rather strange use case. And you, what do you think? What would you do with such a robot?

Video presentation of the Robo-C android:


Friday, 1 November 2019

MIT develops modular robots that can move, jump, recognize and coordinate


The rise of robotics in recent years has found many practical applications in everyday life. While individual robots can perform certain tasks well, swarms of robots are usually more efficient, but achieving reliable communication and coordination among all the robots is a major challenge. To address it, a team at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed a surprisingly simple concept: self-assembling robotic cubes that can climb over one another, jump into the air, and roll along the ground.


Six years after the first iteration of the project, the robots can now communicate with each other using a barcode-like system on each face of the block, allowing the modules to identify one another. The autonomous fleet of 16 blocks can now perform simple tasks or behaviors, such as forming a line or a wall, or following arrows or a light source.

Inside each modular "M-Block" is a flywheel that spins at 20,000 rpm; when the flywheel is braked, its angular momentum is transferred to the cube, making it move. Permanent magnets on every edge and face allow two cubes to attach to each other.
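The inertial trick can be sketched with back-of-the-envelope physics. The flywheel inertia and braking time below are made-up illustrative numbers (only the 20,000 rpm figure comes from the article): braking transfers the wheel's angular momentum to the cube body as an impulsive torque.

```python
import math

def braking_torque(inertia_kg_m2, rpm, stop_time_s):
    """Average torque delivered to the cube body when the flywheel is
    braked: conservation of angular momentum gives tau = I * omega / dt."""
    omega = rpm * 2 * math.pi / 60.0  # convert rpm to rad/s
    return inertia_kg_m2 * omega / stop_time_s

# Illustrative (assumed) numbers: a tiny flywheel spun to 20,000 rpm
# and braked in 10 ms delivers a sizeable impulsive torque.
tau = braking_torque(inertia_kg_m2=1e-5, rpm=20000, stop_time_s=0.01)
print(round(tau, 2))  # about 2.09 N*m
```

The shorter the braking time, the larger the kick, which is why a hard mechanical brake lets a cube jump rather than merely tip over.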

Interconnected modular robot swarms: many potential applications

The team envisions powerful applications in inspection and, eventually, disaster response. Imagine a burning building in which a staircase has collapsed. In the future, you could simply throw M-Blocks on the ground and watch them build a temporary staircase for climbing up to the roof, or down to the basement, to rescue victims.

"'M' means movement, magnetism, and magic, " says Daniela Rus, MIT professor and director of CSAIL. " Movement, because the cubes can move by jumping. Magnetism, because they can connect to others using magnets and, once connected, they can move and connect to form structures. Magic, because we do not see any moving parts and the cube seems to be magically driven . "

Swarms of modular robots could be used in many areas: inspection and rescue, manufacturing, public health, construction, etc. Credits: Jason Dorfman / MIT CSAIL

While the mechanism inside is quite complex, the exterior is by contrast much simpler, which allows for more robust connections. Beyond inspection and rescue, the researchers also imagine using the blocks for tasks such as gaming, manufacturing, and healthcare.

" What's unique in our approach is that it's inexpensive, robust, and potentially easier to fit into millions of modules, " says Romanishin. " M-Blocks can move in a general way. Other robotic systems have much more complicated motion mechanisms, which require many steps, but our system is more scalable . "

Movement through inertia
Previous modular robotic systems typically handle movement using modules with small robotic arms, known as external actuators. These systems require a great deal of coordination even for the simplest movements, with multiple commands needed for a single jump.

In 2013, the team developed the mechanism behind the M-Blocks: cubes that move using so-called inertial forces. Instead of using moving arms, each block has an internal mass that it "throws" against its side, causing the block to rotate and move.

Each module can move in four cardinal directions when placed on any of its six faces, giving 24 different directions of movement. Without small arms and appendages protruding from the blocks, it is much easier for them to avoid damage and collisions.

Optimized coordination via barcode communication

On the communication side, other attempts have used infrared light or radio waves, which can quickly become unwieldy: if many robots in a small area all try to send signals at once, the result is confusion, and multiple radio signals in a small volume interfere with one another.

Romanishin developed algorithms designed to help the robots perform simple tasks, or "behaviors," which led to the idea of a barcode-like system by which the robots can sense the identity of the other blocks they are connected to.

In one experiment, the team instructed the modules to form a line from a random structure. The blocks checked whether they could determine how they were connected to each other; if not, each picked a direction and "rolled" until it reached the end of the line.

Essentially, the blocks used the connection pattern - the way they were linked to each other - to guide their movements, and 90% of the M-Blocks managed to form a line.
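As a toy illustration of this behavior (not MIT's actual distributed algorithm, which each block runs locally using barcode-sensed neighbors), the line-forming rule can be sketched as: blocks already in the target row stay put, while every other block rolls to the next free slot at the end of the line.

```python
def form_line(blocks):
    """Toy sketch of line forming on a grid of (x, y) cube positions:
    cubes in the bottom row (y == 0) stay; all others 'roll' one at a
    time to the next free slot at the end of the line."""
    line = sorted(x for x, y in blocks if y == 0)
    movers = [(x, y) for x, y in blocks if y != 0]
    next_x = max(line) + 1 if line else 0
    for _ in movers:
        line.append(next_x)  # the mover docks at the end of the line
        next_x += 1
    return sorted(line)

# A random 2D structure of 5 blocks collapses into a straight line.
print(form_line({(0, 0), (1, 0), (1, 1), (0, 2), (2, 1)}))  # [0, 1, 2, 3, 4]
```

The real system is decentralized: each cube decides from its own connection pattern, which is why only 90% of the blocks succeeded.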

This video from MIT introduces the recently developed modular robots:




Wednesday, 23 October 2019

Exoskeleton Improves Walking and Running Performance

Robotic clothing


A new "exorroupa" - or a soft robotic exoskeleton - can assist with walking and running, providing significant energy savings in terms of metabolic activity.

Most importantly, the robotic garment yields a net energy gain in both gaits.

"Robotic exoskeletons tend to be bulky and heavy; and while walking experiences have shown promising results, the energy spent running with the additional weight of the device outweighs the benefits of robotic assistance," said Professor Giuk Lee of the University. Chung-Ang in South Korea.


The problem is that walking and running have fundamentally different biomechanics, which makes it challenging to create devices that assist both types of gait.

To address this challenge, the Korean team took two approaches. The first was to build the device mainly out of fabric, including the belt and the fastening wraps, which brought the weight of the entire device down to only 5 kg. The second was to create a mechanism that automatically switches the assistance mode between walking and running, to achieve maximum efficiency.

An algorithm analyzes data collected by the exoskeleton's sensors and classifies the gait (walking or running), feeding the result back to the device to adjust the assistance mode. Treadmill and outdoor testing showed that the algorithm identified the gait correctly more than 99.98% of the time.
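A minimal sketch of such a gait classifier, assuming a simple peak-acceleration threshold (the team's actual classifier, trained on their sensor data, is not described in detail in the article):

```python
def classify_gait(vertical_accel_g, threshold_g=1.5):
    """Toy gait classifier: running has a flight phase followed by a
    hard impact, so peak vertical acceleration is much higher than the
    gentle ~1 g oscillation of walking. Threshold is an assumption."""
    return "running" if max(vertical_accel_g) > threshold_g else "walking"

walking_trace = [1.0, 1.1, 1.2, 1.1, 1.0]  # mild oscillation around 1 g
running_trace = [0.4, 1.0, 2.6, 1.2, 0.5]  # flight phase + impact peak
print(classify_gait(walking_trace))  # walking
print(classify_gait(running_trace))  # running
```

The classification then selects the assistance profile, which is why misclassifying even briefly would waste the wearer's energy.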


Wearable robot assistance


The exoskeleton's assistance reduced the energy cost of walking at a speed of 1.5 meters per second (5.4 km/h) by 9.3%, equivalent to shedding 7.4 kg. Energy savings while running (2.5 meters per second, or 9 km/h) reached 4%, equivalent to a weight loss of 5.7 kg.

"Although changes in metabolic rate are relatively modest, they are of a magnitude similar to those proven to be sufficient to improve maximal walking and running performance. Therefore, we believe these energy savings could result in proportional increases in maximal performance, for example. , on an outdoor race course, "said Professor Lee.

The robotic suit is designed with people with mobility restrictions in mind, particularly those with limited knee function or above-knee amputees. But it can also be used to assist people without physical disabilities.

"We hope this 'dressing robot' will have many uses, such as aiding rehabilitation training for elderly patients and improving the work efficiency of soldiers or firefighters. In the long run, we imagine this exaggeration hanging in the closet all the time, as well as the clothes we wear everyday, "concludes Lee.



Bibliography:

Article: Reducing the metabolic rate of walking and running with a versatile, portable exosuit
Authors: Jinsoo Kim, Giuk Lee, Roman Heimgartner, Dheepak Arumukhom Revi, Nikos Karavas, Ignacio Galiana, Asa Eckert-Erdheim, Patrick Murphy, David Perry, Nicolas Menard, Dabin Kim Choe, Philippe Malcolm, Conor J. Walsh
Journal: Science
Vol. 365, Issue 6454, pp. 668-672
DOI: 10.1126/science.aav7536

Sunday, 23 June 2019

The first non-invasively mind-controlled robotic arm


A team of researchers from Carnegie Mellon University, in collaboration with the University of Minnesota, has made a breakthrough in the field of noninvasive robotic device control. Using a noninvasive brain-computer interface (BCI), researchers have developed the first-ever successful mind-controlled robotic arm exhibiting the ability to continuously track and follow a computer cursor.

Being able to noninvasively control robotic devices using only thoughts will have broad applications, in particular benefiting the lives of paralyzed patients and those with movement disorders.


BCIs have been shown to achieve good performance for controlling robotic devices using only the signals sensed from brain implants. When robotic devices can be controlled with high precision, they can be used to complete a variety of daily tasks. Until now, however, BCIs successful in continuously controlling robotic arms have used invasive brain implants. These implants require a substantial amount of medical and surgical expertise to correctly install and operate, not to mention cost and potential risks to subjects. As such, their use has been limited to just a few clinical cases.

A grand challenge in BCI research is to develop less invasive or even totally noninvasive technology that would allow paralyzed patients to control their environment or robotic limbs using their own “thoughts.” Such noninvasive BCI technology, if successful, would bring such much-needed technology to numerous patients and even potentially to the general population.

However, BCIs that use noninvasive external sensing, rather than brain implants, receive “dirtier” signals, leading to lower resolution and less precise control. Thus, when using only the brain to control a robotic arm, a noninvasive BCI doesn’t stand up to using implanted devices. Despite this, BCI researchers have forged ahead, their eye on the prize of a less- or non-invasive technology that could help patients everywhere on a daily basis.

Bin He, department head and professor of biomedical engineering at Carnegie Mellon University, is achieving that goal, one key discovery at a time.


“There have been major advances in mind controlled robotic devices using brain implants. It’s excellent science,” says He. “But noninvasive is the ultimate goal. Advances in neural decoding and the practical utility of noninvasive robotic arm control will have major implications on the eventual development of noninvasive neurorobotics.”

Using novel sensing and machine learning techniques, He and his lab have been able to access signals deep within the brain, achieving high-resolution control of a robotic arm. With noninvasive neuroimaging and a novel continuous pursuit paradigm, He is overcoming the noisy EEG signal, significantly improving EEG-based neural decoding and facilitating real-time, continuous 2D control of robotic devices.

Using a noninvasive BCI to control a robotic arm that’s tracking a cursor on a computer screen, for the first time ever, He has shown in human subjects that a robotic arm can now follow the cursor continuously. Whereas robotic arms controlled by humans noninvasively had previously followed a moving cursor in jerky, discrete motions—as though the robotic arm was trying to “catch up” to the brain’s commands—now, the arm follows the cursor in a smooth, continuous path.
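As a loose illustration of what continuous decoding means, here is a one-dimensional toy decoder that maps a scalar EEG feature to cursor velocity by ordinary least squares. This is only a stand-in: the paper's method uses EEG source imaging and a continuous pursuit paradigm, not this simple regression, and the feature and velocity values below are invented.

```python
from statistics import mean

def fit_linear_decoder(features, velocities):
    """Fit velocity = slope * feature + bias by ordinary least squares.
    A 1D toy of the 'neural features -> continuous velocity' idea."""
    fx, fy = mean(features), mean(velocities)
    slope = sum((f - fx) * (v - fy) for f, v in zip(features, velocities)) \
        / sum((f - fx) ** 2 for f in features)
    return slope, fy - slope * fx

feats = [0.0, 1.0, 2.0, 3.0, 4.0]     # made-up EEG band-power feature
vels = [0.1, 2.1, 4.1, 6.1, 8.1]      # made-up cursor velocities
slope, bias = fit_linear_decoder(feats, vels)
print(round(slope, 2), round(bias, 2))  # 2.0 0.1
```

Because the decoder outputs a velocity at every frame rather than a discrete target, the arm can follow the cursor in a smooth path instead of jerky jumps.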

In a paper published in Science Robotics, the team established a new framework that addresses and improves upon the “brain” and “computer” components of BCI by increasing user engagement and training, as well as spatial resolution of noninvasive neural data through EEG source imaging.

The paper, “Noninvasive neuroimaging enhances continuous neural tracking for robotic device control,” shows that the team’s unique approach to solving this problem not only enhanced BCI learning by nearly 60% for traditional center-out tasks, it also enhanced continuous tracking of a computer cursor by more than 500%.

"This work represents an important step in noninvasive brain-computer interfaces, a technology which someday may become a pervasive assistive technology aiding everyone, like smartphones."
Bin He, Department Head, Biomedical Engineering

The technology also has applications that could help a variety of people, by offering safe, noninvasive “mind control” of devices that can allow people to interact with and control their environments. The technology has, to date, been tested in 68 able-bodied human subjects (up to 10 sessions for each subject), including virtual device control and controlling of a robotic arm for continuous pursuit. The technology is directly applicable to patients, and the team plans to conduct clinical trials in the near future.

“Despite technical challenges using noninvasive signals, we are fully committed to bringing this safe and economic technology to people who can benefit from it,” says He. “This work represents an important step in noninvasive brain-computer interfaces, a technology which someday may become a pervasive assistive technology aiding everyone, like smartphones.”



Bibliography:

Noninvasive neuroimaging enhances continuous neural tracking for robotic device control
B. J. Edelman, J. Meng, D. Suma, C. Zurn, E. Nagarajan, B. S. Baxter, C. C. Cline, B. He
Science Robotics
Vol. 4, Issue 31, eaaw6844
DOI: 10.1126/scirobotics.aaw6844

Friday, 21 June 2019

Fish-inspired electric camera can see in total darkness

This is the elephantnose fish, which sees using a keen electric sense. [Image: Maik Dobiey / Uni Bonn]

Electric sense

Robots inspired by the African elephantnose fish (Gnathonemus petersii) will be able to see through the murky waters typical of environmental disaster areas, or wherever objects must be located in aquatic environments - whether because they are lost or because they require maintenance.

Instead of the optical images captured by traditional cameras, engineers and zoologists at the University of Bonn in Germany have designed a camera that generates "electric images," in which colors are detected as the electrical signatures of objects, just as the elephantnose fish does.

Using an electric organ in its tail, the fish generates short electrical pulses up to 80 times per second. Electroreceptors on its skin, and especially on its trunk-like chin, measure how the pulses are modulated by the environment.

With this "electric sense," fish can estimate distances, perceive shapes and materials, and even distinguish between living and dead objects. In fractions of a second, he uses the electric pulses to detect where the mosquito larvae, his favorite prey, are hiding in the bottom of his habitat.

Electrical Camera

Martin Gottwald built a bionic camera by drawing on the two types of electroreceptors the elephantnose fish uses for active electrolocation: one measures the intensity of the electrical signal and the other the waveform of the pulse.

Combining these two signals, Gottwald found that it is possible to produce "electric colors," analogous to the visual colors detected by the human eye, using only electrical signals rather than visible light.
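A minimal sketch of the "electric color" idea, assuming (as in the elephantnose-fish literature) that the ratio of the two receptor signals is roughly independent of object distance, so it characterizes the object's material the way hue does in vision. This is an illustrative assumption, not the paper's exact formula, and the signal values below are invented.

```python
def electric_color(amplitude_mod, waveform_mod):
    """Toy 'electric color': the ratio of amplitude modulation to
    waveform modulation. Both signals shrink with distance at roughly
    the same rate (assumption), so the ratio stays constant."""
    return amplitude_mod / waveform_mod

near = electric_color(0.8, 0.4)  # object close: both signals strong
far = electric_color(0.2, 0.1)   # same object far away: both weaker
print(near == far)  # True - same 'color' regardless of distance
```

A distance-invariant quantity is exactly what a camera needs to identify materials without knowing how far away the object is.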

"With this bionic electric camera it is possible to photograph 'electric images' of objects without any light, even in an obscure environment, which also allows an analysis of the electrical and spatial properties of the objects represented," says Professor Gerhard von der Emde.

"Additional assessments have shown that electrical images can also be used to determine the 'electrical strokes' of measured objects, which, similar to their optical contours, can provide shape and orientation information," Gottwald said.

Electric field lines scattering around the camera and a plant stem are shown in bluish-white. They run from the transmitting electrodes at the rear to the measuring electrodes at the front and toward the center of the (gold) electric camera. [Image: Martin Gottwald / Hendrik Herzog]

Robots, drones and medical applications

Tests have shown that the camera-based system can identify various natural objects, such as fish, plants, and wood, as well as artificial objects such as aluminum or plastic spheres and rods.

Because the system does not depend on light, none of the parameters used by the camera is affected by darkness or by the turbidity of the water.

In addition to allowing the development of inspection drones or robots to operate in hazy or misty environments, the team sees many other applications for electric cameras, including material control, device monitoring, and medical applications.




Bibliography:

A bio-inspired electric camera for short-range object inspection in murky waters
Martin Gottwald, Hendrik Herzog, Gerhard von der Emde
Bioinspiration & Biomimetics
DOI: 10.1088/1748-3190/ab08a6

Thursday, 20 June 2019

Exoskeleton protects your back in real time

The device runs on battery power, performing all processing onboard. [Image: Fraunhofer IPK]

Orthosis for perfect ergonomics

A multidisciplinary team from the Fraunhofer Institute in Germany has created an exoskeleton that promises to minimize back problems for workers who must carry loads or perform repetitive movements.

Back pain, often disabling and a leading cause of absence from work, mainly affects workers in logistics, manufacturing, and services, where physically strenuous movement patterns are part of the daily routine.

More than a piece of PPE (personal protective equipment), the device is an orthosis capable of detecting movements in real time and reacting to them to maintain proper posture.

"The unparalleled feature of our soft robotic orthosis is its real-time motion analysis. Specially developed algorithms based on machine learning and artificial intelligence allow ergonomics to be analyzed.

"This distinguishes this bracing from commercially available exoskeletons, which are typically robots that, according to their functional principles, amplify all kinds of movements - even non-ergonomic ones - and only divert the load placed on the user from an overloaded part of the body for a less- demanding area, "explained Professor Henning Schmidt, development coordinator for the equipment, named ErgoJack .

The exoskeleton emits alerts when the worker adopts non-ergonomic postures or movements. Inertial measurement units embedded in the vest compare pre-learned movement patterns with the worker's actual movement and evaluate it in real time, which takes only a few hundred milliseconds. The miniaturized motion sensors are located on the shoulders, back, and thighs.
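One simple way such a posture check could work (an illustrative sketch, not Fraunhofer's actual algorithm): estimate trunk flexion from the gravity vector measured by a torso IMU, and flag angles beyond a hypothetical ergonomic limit.

```python
import math

def trunk_flexion_deg(ax, ay, az):
    """Estimate trunk tilt from a torso accelerometer's gravity vector
    (z axis nominally vertical when standing upright)."""
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(az / norm))

def ergonomic(ax, ay, az, limit_deg=20.0):
    """Flag bending beyond a (hypothetical) ergonomic limit."""
    return trunk_flexion_deg(ax, ay, az) <= limit_deg

print(ergonomic(0.0, 0.0, 1.0))   # standing upright -> True
print(ergonomic(0.9, 0.0, 0.44))  # bent forward ~64 degrees -> False
```

The real system fuses several IMUs and learned movement patterns, but a gravity-vector tilt check like this is the usual starting point for posture monitoring.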

The tests were done at a Ford factory in Germany, and the team is now trying to license the technology to a commercial partner.


Wednesday, 19 June 2019

Smelling robot shows how hard it is to imitate a vulture

The robot learned to detect the source of an ethanol leak quickly and efficiently. [Image: Reza Khodayi-mehr]

Robot nose

Science is still far from fully understanding smell, but electronic noses have tried to imitate this natural sense because it is extremely useful, from the perfume and food industries to the detection of air pollutants and toxic substances.

Researchers at Duke University in the United States were trying to improve the electronic nose their team had created, in order to give a robot the ability to detect the source of pollutant emissions and toxic leaks.

They quickly discovered that it is not enough to have the robot "follow its nose".


Moreover, although birds such as vultures can find their food from tens of kilometers away, imitating nature still does not bear fruit, precisely because we do not understand smell.

"Many approaches that employ robots to locate airborne particles are based on bioinspired but simplistic assumptions or heuristic techniques that drive robots against the wind or follow increasing concentrations.These methods can usually locate only one source in open space and can not estimate other equally important parameters such as release rates, "explains Professor Michael Zavlanos.

Physics of airflow and optimized route

In complex environments, current methods can lead robots to areas where concentrations happen to be high because of the physics of the airflow, not because they are near the source of the leak.

The team found the solution precisely by analyzing these airflows in real time, which allowed the source of an emission to be tracked more efficiently.

The robot takes a concentration measurement of the compound in question, combines it with previous measurements, and solves an optimization problem to estimate the probability of each possible location of the emission source. It then calculates the most promising place to test that estimate and moves there to take the next measurement, repeating the process until the source is found.

"By combining physics-based models with optimized route planning we can find out where the source is with very few measurements," said Zavlanos. "This is because physics-based models provide correlations between measurements that are not taken into account by purely data-driven approaches, and optimized route planning allows the robot to select those few measurements with the highest information content."

The group is already working on machine learning algorithms to make their models both more efficient and more accurate. They are also working to extend the idea to programming a fleet of robots to conduct a methodical search of a large area. Although the robotic-swarm approach has not yet been tested in practice, the team has published simulations that demonstrate its potential.


Bibliography:

Model-Based Active Source Identification in Complex Environments
Reza Khodayi-mehr, Wilkins Aquino, Michael M. Zavlanos
IEEE Transactions on Robotics
DOI: 10.1109/TRO.2019.2894039
https://arxiv.org/abs/1706.01603

Sunday, 16 June 2019

Robot-sloth monitors environment in the rhythm of nature

Promotional image of the sloth robot monitoring its environment.

Sloth robot

Robots always seem to be associated with being faster, more agile, stronger, and so on.

But nature is in no such hurry, and monitoring the environment is a job better suited to robots that are persistent, smooth, and silent.

With that in mind, a trio of roboticists at the Georgia Institute of Technology in the USA created a "sloth robot."

Suspended on a network of cables so it can cover large areas, the robot moves to collect data only when it detects environmental changes, such as shifts in weather or in the chemical composition of the atmosphere.


Or it can be programmed to crawl slowly along the cables, like the sloth that gives it its name, without scaring the local wildlife it may be trying to observe.

Besides following the rhythm of nature, not moving constantly without need means the robot uses very little energy. Power comes from batteries recharged by a small photovoltaic panel, which lets the robot stay in the field for long periods without maintenance.
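This energy-saving idle behavior amounts to event-triggered sampling, which can be sketched in a few lines (an illustrative sketch; SlothBot's actual trigger logic is not described in detail in the article):

```python
def event_triggered_log(readings, threshold=0.5):
    """Sketch of event-triggered sensing: stay idle and record a new
    sample only when the measurement drifts past a threshold from the
    last logged value, saving energy between events."""
    log = [readings[0]]
    for r in readings[1:]:
        if abs(r - log[-1]) >= threshold:
            log.append(r)
    return log

# Simulated temperature trace: only the significant changes are logged.
temps = [20.0, 20.1, 20.2, 21.0, 21.1, 19.9, 20.0]
print(event_triggered_log(temps))  # [20.0, 21.0, 19.9]
```

Logging three samples instead of seven is a modest saving here, but over months of deployment the same rule is what lets a solar-charged robot live on its cable indefinitely.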

Moving along cables

The sloth robot's construction is simple, with all structural parts 3D-printed.

But moving through a network of cables strung through a forest was the hard part.


A wild sloth moving through the cable system where the robot will be tested. [Image: M. Zachariah Peery]

"It's a tricky move and you have to do it right to provide a fail-safe transition. Making sure the switches work well for long periods of time is really the greater challenge, "said researcher Gennaro Notomista.

Mechanically, the sloth robot consists of two bodies connected by an electrically actuated hinge. Each body houses a drive motor connected to a rim with a tire. Wheeled locomotion, as in cable cars, is simple, energy-efficient, and safer than the other types of locomotion the team studied.

A robot for observing nature

"There's a lot we do not know about what actually happens under dense tree-covered areas," explained researcher Magnus Egerstedt. "Most of the time, SlothBot will hang there and, from time to time, it will move to a sunny spot to recharge the battery."

The first real-world tests will take place on a cacao plantation in Costa Rica, which already has a resident population of real sloths and a ready-made cable system.

"The cords used to move the cacao have become a highway for the sloths because the animals find them useful to move," Egerstedt said. "If all goes well, we'll install the SlothBots along the cables to monitor the sloths."


Bibliography:

The SlothBot: A Novel Design for Wire-Traversing Robots
Gennaro Notomista, Yousef Emam, Magnus Egerstedt
IEEE Robotics and Automation Letters, Vol. 4, Issue 2
DOI: 10.1109/LRA.2019.2899593