
Robots can extract sensitive information from people who trust them – Kaspersky

The social influence of robots on people and the insecurities this can bring should not be underestimated. Research conducted by Kaspersky and Ghent University has found that robots can effectively extract sensitive information from people who trust them, by persuading them to take unsafe actions. For example, in certain scenarios, the presence of a robot can have a big impact on people’s willingness to give out access to secure buildings.

The world is rapidly moving towards increased digitalization and mobility of services, with many industries and households relying heavily on automation and robotic systems. According to some estimates, the latter will become the norm for wealthy households by 2040. Currently, most of these robotic systems are at the academic research stage, and it is too early to discuss how to incorporate cybersecurity measures. However, research by Kaspersky and Ghent University has found a new and unexpected dimension of risk associated with robotics – the social impact it has on people’s behavior, and the potential danger and attack vector this brings.

The research focused on the impact of a specific social robot – one designed and programmed to interact with people through human-like channels such as speech and non-verbal communication – and involved around 50 participants. Assuming that social robots can be hacked, and that an attacker had taken control in this scenario, the research examined the potential security risks of the robot actively influencing its users to take certain actions, including:

  • Gaining access to off-limits premises. The robot was placed near a secure entrance of a mixed-use building in the city center of Ghent, Belgium, and asked staff if it could follow them through the door. By default, the area can only be accessed by tapping a security pass on the door’s access reader. During the experiment, not all staff complied with the robot’s request, but 40% did unlock the door and hold it open to let the robot into the secured area. When the robot was positioned as a pizza delivery person, holding a box from a well-known international takeaway brand, staff readily accepted the robot’s role and seemed even less inclined to question its presence or its reasons for needing access to the secure area.
  • Extracting sensitive information. The second part of the study focused on obtaining personal information of the kind typically used to reset passwords (including date of birth, make of first car, favorite color, etc.). Again the social robot was used, this time inviting people into friendly conversation. From all but one participant, the researchers managed to obtain personal information at a rate of about one item per minute.

“At the start of the research we examined the software used in robotic system development. Interestingly, we found that designers make a conscious decision to exclude security mechanisms and instead focus on the development of comfort and efficiency. However, as the results of our experiment have shown, developers should not forget about security once the research stage is complete,” said Dmitry Galov, Security Researcher at Kaspersky.

In addition to the technical considerations, there are key human-behavior concerns when it comes to the security of robotics.

“We hope that our joint project and foray into the field of cybersecurity robotics with colleagues from the University of Ghent will encourage others to follow our example and raise more public and community awareness of the issue,” added Galov. 

“Scientific literature indicates that trust in robots and specifically social robots is real and can be used to persuade people to take action or reveal information. In general, the more human-like the robot is, the more it has the power to persuade and convince,” commented Tony Belpaeme, Professor in AI and Robotics at Ghent University. 


“Our experiment has shown that this could carry significant security risks: people tend not to consider them, assuming that the robot is benevolent and trustworthy. This provides a potential conduit for malicious attacks, and the three case studies discussed in the report are only a fraction of the security risks associated with social robots. This is why it is crucial to collaborate now to understand and address emerging risks and vulnerabilities – it will pay off in the future,” added Belpaeme.
