
FEATURE: NETWORK SECURITY
Mario Cesar Santos, VP Global Solutions, Aware
• Social engineering attacks: Deepfakes can also be used in social engineering attacks, where attackers manipulate individuals into disclosing confidential information or performing harmful actions. For example, a deepfake video could impersonate a CEO instructing an employee to transfer funds to a fraudulent account.
• Disinformation campaigns: Deepfakes can be weaponized in disinformation campaigns to manipulate public opinion. By creating convincing videos of political or other public figures saying or doing things they never did, malicious actors can sow chaos and confusion.
• Identity theft: Deepfakes can be used to steal someone's identity by creating fake videos or images that appear to show the individual. These forgeries could then be used to access confidential accounts or commit other forms of fraud.
• Sabotage and espionage: Deepfakes can also serve purposes of sabotage or espionage. For example, a deepfake video could be used to manipulate a company's stock price or damage its reputation.
Individuals and organizations need to be aware of these risks and take the necessary steps to protect themselves, such as using strong authentication methods (like biometrics) and being highly cautious with this type of manipulated media. For consumers, this means choosing vendors and solutions that consider these threats and offer robust protection against them. For business leaders, it is essential to consider these concerns and take the necessary actions to protect customers and their businesses.

ALTHOUGH DEEPFAKES HAVE GARNERED ATTENTION FOR THEIR ENTERTAINMENT AND CREATIVE VALUE , THEY ALSO PRESENT SERIOUS RISKS TO BUSINESSES .

One such action is integrating biometric authentication technology into existing solutions and offerings. How does biometrics help defend against deepfakes? Let's look at the main possibilities:
• Liveness detection: This is a crucial component of biometric authentication that helps ensure the authenticity of captured biometric data. The technology is designed to detect whether a biometric sample, such as a facial image or voice recording, comes from a living person or from a reproduction, manipulation or deepfake. Liveness detection algorithms analyze factors such as natural movements in a facial image or physiological cues in a voice recording to determine whether the biometric data comes from a living person; they also protect against injection and emulation attacks. Against deepfake threats, liveness detection is essential to prevent malicious actors from using static images or pre-recorded videos to spoof biometric authentication systems. By verifying the liveness of the person providing the sample, the technology helps defend against deepfake attacks and preserves the integrity of the authentication process (a minimal motion-based check is sketched after this list).
• Behavioral biometrics: This involves analyzing patterns in an individual's behavior, such as typing speed, mouse movements and swipe patterns on a touchscreen device. These behavioral patterns are unique to each individual and can be used to verify their identity. When applied to deepfake detection, behavioral biometrics can help identify anomalies in user behavior that may indicate a video or image has been manipulated.
• Voice recognition: By analyzing various aspects of a person's voice, such as volume, tone and cadence, voice recognition systems can verify the validity of an identity. In the context of deepfake detection, this method can help identify unnatural or inconsistent speech patterns that may indicate a video or audio recording has been manipulated.
• Multimodal biometrics: This involves combining multiple biometric authentication methods to increase security. By using a combination of facial recognition, voice recognition and behavioral biometrics, for example, it is possible to achieve a more robust defense against deepfake threats. By requiring multiple forms of biometric authentication, these systems make it far more difficult for malicious actors to create convincing deepfakes (a simple score-fusion sketch follows below).
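
To make the liveness idea concrete, here is a minimal sketch in Python of a motion-based check: it flags a face-capture session as suspicious when consecutive frames show almost no natural movement, as would happen with a printed photo or a looped static image held up to the camera. The frame format, threshold and function names are assumptions for illustration only, not how any particular vendor's liveness detection works.

    import numpy as np

    def motion_score(frames: np.ndarray) -> float:
        # Mean absolute pixel change between consecutive grayscale frames.
        # frames: shape (num_frames, height, width), values roughly 0..255.
        diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
        return float(diffs.mean())

    def looks_live(frames: np.ndarray, min_motion: float = 1.0) -> bool:
        # Very rough check: a living face produces small but constant motion
        # (blinks, micro-movements); a static spoof produces almost none.
        # The threshold is an illustrative assumption, not a calibrated value.
        return motion_score(frames) >= min_motion

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # "Static" capture: one image repeated 30 times with sensor noise only.
        still = np.tile(rng.integers(0, 256, (1, 64, 64)), (30, 1, 1)).astype(np.float32)
        still += rng.normal(0, 0.5, still.shape)
        # "Live" capture: the same image drifting slightly from frame to frame.
        live = still + np.cumsum(rng.normal(0, 2.0, still.shape), axis=0)
        print("static capture passes liveness?", looks_live(still))  # expected: False
        print("moving capture passes liveness?", looks_live(live))   # expected: True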
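
And here is an equally simplified sketch of multimodal score fusion, folding in hypothetical face, voice and behavioral scores. The modality names, weights and acceptance threshold are assumptions chosen purely for illustration: each modality returns a match score against the enrolled template, and the system accepts only when the weighted combination clears the threshold, so a deepfake that fools a single modality is not enough on its own.

    from dataclasses import dataclass

    @dataclass
    class ModalityScore:
        name: str      # e.g. "face", "voice", "behavior" (illustrative labels)
        score: float   # similarity to the enrolled template, from 0.0 to 1.0
        weight: float  # how much this modality contributes to the decision

    def fused_score(scores: list) -> float:
        # Weighted average of the per-modality match scores.
        total_weight = sum(s.weight for s in scores)
        return sum(s.score * s.weight for s in scores) / total_weight

    def authenticate(scores: list, threshold: float = 0.8) -> bool:
        # Accept only when the fused score clears the threshold. An attacker
        # now has to fool several independent checks at once, so a convincing
        # deepfake of a single modality (say, the face) is not sufficient.
        return fused_score(scores) >= threshold

    if __name__ == "__main__":
        # A deepfake that fools face recognition but not voice or behavior checks.
        attempt = [
            ModalityScore("face", 0.95, weight=0.4),
            ModalityScore("voice", 0.40, weight=0.4),
            ModalityScore("behavior", 0.30, weight=0.2),
        ]
        print("fused score:", round(fused_score(attempt), 2))  # expected: 0.6
        print("accepted?", authenticate(attempt))              # expected: False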