Cybercriminals are keeping up with the times, actively applying modern technologies to the targeting of potential victims, which allows them to expand their deception techniques.
Scammers are not only doctoring voice messages but also drawing on posts from social networks to create fake requests with the help of AI (artificial intelligence), writes NBN, citing iTECHua material.
One of the most widespread technologies at the moment is the “deepfake”: artificial intelligence generates a convincing but false image of a person’s face, which can be inserted into both photos and videos and used as compromising material against a potential victim.
In addition, fake requests are sent to a victim’s contact list both on social networks and in instant messengers: in 2023, funds were stolen from many accounts of trusting citizens with the help of voice imitation.
To avoid getting “hooked” by a fraudster, it is therefore recommended to contact the person directly, for example by a voice call over a mobile connection, and confirm that they actually requested the money transfer, rather than jumping to conclusions and completing a transaction after a single listened-to message.
We previously wrote about how investing in Bitcoin has become much easier.