Dec 06, 2022
Many people assume that obtaining key personal information is very hard, but it is becoming easier than ever.
Last week, I was reading how Elon Musk let go of thousands of people at Twitter using an AI-based bot trained to reproduce messages in the voices of famous people. I find it clownish but VERY dangerous.
This impersonation technology, in the wrong hands, could let someone obtain critical information about a business. It's even more effective than phishing.
Combining the two (plus some threat intelligence) could be the perfect weapon for a malicious campaign.
Imagine this: you record the voice of a senior manager in a company - let's say the Procurement Manager. You train the bot to reproduce that voice, then call the Finance Manager with it and ask them to transfer money to an external party, backed by a phishing mail that impersonates the Procurement Manager.
Bingo! It's money in the bank.
In fact, it's already happening.
There are even worse attacks that I won't describe here; this technology could become a weapon of mass destruction.
Conclusion: offensive cybersecurity techniques are evolving very quickly, and a black hat can do a lot of harm to a company that lacks the right controls and processes.
Check everything twice.
Have the correct processes in place in your company before you act.
Train your workforce, family and friends.
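The "correct processes" above boil down to one rule: never act on a request over the same channel it arrived on. Below is a minimal sketch of such an out-of-band verification gate; all names (`PaymentRequest`, `may_execute`, the channel labels) are hypothetical illustrations, not a real API.

```python
from dataclasses import dataclass, field

@dataclass
class PaymentRequest:
    requester: str            # who appears to be asking, e.g. "Procurement Manager"
    beneficiary: str          # external party receiving the funds
    amount: float
    channel: str              # how the request arrived: "voice", "email", ...
    confirmations: set = field(default_factory=set)

# Channels that can never self-approve: a voice call or an email alone
# is treated as unverified, no matter how convincing it sounds.
UNTRUSTED_CHANNELS = {"voice", "email"}

def confirm(request: PaymentRequest, channel: str) -> None:
    """Record a confirmation received over an independent channel."""
    request.confirmations.add(channel)

def may_execute(request: PaymentRequest) -> bool:
    """Release funds only if at least one confirmation came over a channel
    different from the one the request arrived on, and that channel is not
    itself an untrusted medium."""
    independent = {
        c for c in request.confirmations
        if c != request.channel and c not in UNTRUSTED_CHANNELS
    }
    return len(independent) > 0

req = PaymentRequest("Procurement Manager", "Acme Ltd", 50_000.0, channel="voice")
print(may_execute(req))   # False: the voice call alone is not enough
confirm(req, "callback_to_registered_number")
print(may_execute(req))   # True: confirmed over an independent channel
```

The point of the sketch is that the cloned voice never gets a vote: approval depends on a channel the attacker does not control, such as a callback to a number already on file.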
AI voice attack - Social Engineering 3.0. There is no mitigation method for voice impersonation itself; process controls are the only defense.