Can you imagine a world where there is no police or law enforcement? What would it mean? More crimes. More greed. More exploitation. Less safety.
The dark web needs no introduction. Hidden deep beneath the internet is the dark web, where crime has no limits and users have no face. Until now. DarkBERT could be the potential savior that limits crime on the dark web.
Before delving deep into what DarkBERT is, here are some stats to give a glimpse at the size and number of illegal activities that take place on the dark web:
So what does all this mean? In one word: trouble!
"Ultimate excellence lies not in winning every battle, but in defeating the enemy without ever fighting," said Sun Tzu in the Art of War.
On the surface level web, we have tools—SIEM solutions integrated with UEBA and SOAR capabilities—to help us achieve that excellence. But what happens on the dark web, where anonymous users exchange sensitive data in encrypted languages and sell stolen PII, forged data from big companies, malware, botnets, and exploit kits?
Detecting and tracking these activities seemed close to impossible until recently, when researchers from South Korea came together to build a language model trained exclusively on the dark web, called DarkBERT.
To understand DarkBERT, we first need to understand its predecessor. So, let us rewind a bit and dig a little deeper.
Natural language processing (NLP) is the branch of computer science that humanizes computers, i.e., it helps computer programs understand and interpret human language, including its syntax, semantics, and lexicon. We come across NLP programs on a daily basis: autocorrect, voice assistants, and chatbots. So, how does NLP understand, interpret, and communicate in human language?
For this, it uses the help of language models. Many language model architectures exist today—n-grams, feedforward neural networks, and recurrent neural networks, to name a few. The transformer is another such architecture, and many language models have been built on top of it. The Bidirectional Encoder Representations from Transformers (BERT) and the Generative Pre-trained Transformer (GPT) are two popular models based on the transformer architecture.
BERT is a pre-trained language model based on the transformer architecture. Pre-trained language models are models trained on huge amounts of data before being adapted to specific tasks. BERT was first introduced by Google in 2018 and pretty soon became a revolutionary language model. This is because, unlike its predecessors, it could interpret text bidirectionally, i.e., it had a better understanding of sentences and the context in which they were used. BERT uses the masked language modeling (MLM) technique to set itself apart from other language models. MLM masks random words of a sentence and, based on the context provided by the words surrounding each mask, BERT completes the sentence by inserting the right word.
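To make the masking step concrete, here is a minimal sketch of how MLM training data is prepared. This is not BERT itself—just a toy function (with an invented name, `mask_tokens`) that hides a random fraction of tokens, roughly mirroring the 15% masking rate used in BERT's pre-training, and records the original words the model must learn to recover:

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=1):
    """Randomly replace tokens with [MASK], returning the masked
    sequence and a map of position -> original word (the training
    labels the model must predict from context)."""
    rng = random.Random(seed)
    masked, labels = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(MASK)
            labels[i] = tok  # the model is trained to recover this word
        else:
            masked.append(tok)
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, labels = mask_tokens(tokens)
```

During real pre-training, the model sees the masked sequence and is scored on how well it predicts each hidden word from its bidirectional context.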
Based on the BERT framework, researchers from Facebook created a model called the Robustly Optimized BERT Approach (RoBERTa). This model outperformed BERT largely because it was trained on a data set almost 10 times larger. Moreover, RoBERTa used an improved MLM technique—dynamic masking—in which the masked positions in a sentence change each time it is seen during training rather than being fixed once. The researchers also dropped BERT's next sentence prediction objective, which asked the model to judge whether two sentences belonged together, after finding it did not improve performance.
Though both these models did extremely well on the surface web, they could never decode the language of the dark web, as they were never trained for that purpose. While conversations on the surface web are in ordinary human language, the dark web is rife with coded jargon, slang, and deliberately obfuscated phrasing used to exchange anonymous messages. It was RoBERTa that served as the base architecture during the development of DarkBERT.
DarkBERT is a pre-trained language model that has been trained on 2.2TB of data collected from multiple websites on Tor. The websites from which the data was collected contained information that was sensitive and could be harmful if exposed. To avoid any exposure, the data used for training was filtered, balanced categorically (an equal amount of data from the different categories was selected), de-duplicated (repetitive information was removed), and pre-processed (the data was cleaned to avoid incomplete or sensitive information from being used for training) by the researchers.
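The cleaning steps described above—de-duplication, category balancing, and filtering—can be sketched as a small pipeline. This is an illustration only, under the assumption that pages arrive as (category, text) pairs; the function name and the trivial filtering placeholder are invented, and the researchers' actual pipeline is far more sophisticated:

```python
import hashlib
from collections import defaultdict

def preprocess(pages, per_category_cap):
    """Toy corpus-cleaning pipeline: drop duplicate pages (by content
    hash), apply a placeholder cleaning step, and cap each category
    at the same number of pages to balance the corpus."""
    seen = set()
    buckets = defaultdict(list)
    for category, text in pages:
        digest = hashlib.sha256(text.encode()).hexdigest()
        if digest in seen:       # de-duplication: skip repeated pages
            continue
        seen.add(digest)
        cleaned = text.strip()   # placeholder for real sensitive-data filtering
        if cleaned:
            buckets[category].append(cleaned)
    # categorical balancing: keep at most the cap per category
    return {cat: docs[:per_category_cap] for cat, docs in buckets.items()}

pages = [
    ("forum", "stolen card dump for sale"),
    ("forum", "stolen card dump for sale"),  # exact duplicate, dropped
    ("market", "exploit kit listing"),
    ("forum", "new botnet thread"),
]
corpus = preprocess(pages, per_category_cap=1)
```

Hashing page contents is a cheap way to catch exact duplicates; near-duplicate detection and genuine sensitive-data redaction would require considerably more machinery.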
DarkBERT can automate the process of detecting threats on the dark web.
Its current capabilities include:
Classifying dark webpages to decide which pages to focus more on.
Identifying a threat from communication between members of groups, i.e., it can detect a possible attack on an organization from the messages and data exchanged on various forums.
Identifying a data breach discussion thread with the help of various keywords.
Detecting threat-related keywords. DarkBERT applies MLM-style fill-mask inference: given a sentence with certain words masked, it predicts the contextually appropriate term, which helps surface threat-related vocabulary.
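As a rough intuition for the thread-identification capability, here is a toy keyword-scoring triage function. DarkBERT learns these associations from data rather than from a hand-written list; the keyword set, threshold, and function name below are all invented for illustration:

```python
# Invented keyword list for illustration; a trained model would learn
# which terms signal a breach discussion from context instead.
BREACH_KEYWORDS = {"leak", "dump", "database", "breach", "credentials"}

def breach_score(thread_text):
    """Count how many distinct breach-related keywords appear in a thread."""
    words = {w.strip(".,!?").lower() for w in thread_text.split()}
    return len(words & BREACH_KEYWORDS)

threads = [
    "Fresh database dump, 2M credentials inside",
    "Best VPN recommendations?",
]
# Flag threads that hit at least two keywords (arbitrary threshold).
flagged = [t for t in threads if breach_score(t) >= 2]
```

A contextual model improves on this kind of matching precisely because dark web forums use euphemisms and coded terms that no fixed keyword list can anticipate.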
If a tool like DarkBERT had existed a few years back, the WannaCry ransomware, which caused an estimated $4 billion in losses, might have been just another threat flagged by DarkBERT while the attack was being plotted in the deep corners of the web.
With data increasingly being saved on digital devices or on cloud platforms, there is a greater need for security now more than ever before. The introduction of DarkBERT will make organizations want to incorporate dark web audits for their due diligence.
In the future, security analytics and SIEM vendors will begin to offer solutions that can leverage DarkBERT. Here are some possible benefits of a SIEM solution that can do this:
It is hard to determine what the future holds for cybersecurity tools in an industry as dynamic as information technology. But if there is one thing that remains constant for cybersecurity tools, it is that they need to stay ahead of their enemy at all times. With the introduction of an AI like DarkBERT, the future is full of possibilities, and what may come out of it will only be for the better.
With DarkBERT, the chances of you winning without fighting increase by leaps and bounds.
© 2021 Zoho Corporation Pvt. Ltd. All rights reserved.