Darknet Op-Ed: What We Leave Behind is Our Personal Data


While browsing the internet, we leave behind traces of our private and personal information. Such traceable data can expose sensitive details about our personal lives and can endanger people who choose to share particular political opinions or who act as whistleblowers.

Anonymity networks aim to protect such sensitive personal information by separating individuals’ identities from the content they access on the internet. With an estimated 1.7 million users per day, the Tor network is an ideal example in this context. Tor is a unique volunteer-driven anonymity network that offers its users strong security properties, such as onion encryption, along with low-latency transmission that supports interactive applications such as web browsing. Accordingly, Tor represents a secure means for everyday users to browse the internet.

Unfortunately, this performance profile comes at the price of vulnerabilities that open the door to traffic analysis attacks. Previous research has presented several passive and active forms of traffic analysis attacks that deanonymize network users by correlating the traffic traces monitored at various nodes across the network. Two main elements affect the success of traffic analysis attacks:

  1. The adversary utilizes a specific group of attack metrics to detect relationships among the identified network traffic streams. For example, in a confirmation attack, the adversary monitors both the traffic entering the network and the egress traffic between the exit relay node and the destination domain of a connection. Using the attack metrics, the adversary tries to pinpoint similarities between the ingress and egress streams in order to link incoming and outgoing internet traffic. If the monitored data can be reliably distinguished by the metric, the adversary can correlate a user’s identity with the content he/she accesses online, facilitating the deanonymization of internet users.
  2. The number of relay nodes controlled by the adversary boosts the probability of successfully monitoring relevant connections. Routing attacks, in turn, improve an adversary’s chances by forcing network connections to route through compromised relay nodes. A study conducted in 2016 showed that around 40% of Tor’s network circuits are vulnerable to various forms of traffic analysis attacks when the adversary operates at the level of autonomous systems. For a state-level adversary, or in the context of collusion, the probability of vulnerability rises to up to 85%.
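To make the correlation idea above concrete, the sketch below compares hypothetical per-second packet counts observed on the ingress and egress sides of a connection, using the Pearson correlation coefficient as a stand-in attack metric. The data, function name, and choice of metric are illustrative assumptions for this example, not the method of any particular study:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-second packet counts: the egress trace mirrors the
# ingress trace with slight perturbation, while the third trace belongs
# to an unrelated connection.
ingress   = [12, 3, 0, 8, 15, 2, 9, 1]
egress    = [11, 4, 0, 7, 14, 2, 10, 1]
unrelated = [7, 7, 6, 5, 6, 7, 5, 7]

print(pearson(ingress, egress))     # strong positive correlation
print(pearson(ingress, unrelated))  # much weaker correlation
```

A high score across a mix of such metrics is what lets the adversary claim that two observed streams belong to the same circuit.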

Counteracting Traffic Analysis Attacks

A recently published paper focused on two essential design trade-offs for anonymity networks. First, their security characteristics determine the expected protection against various forms of traffic analysis attacks. Second, their network performance limits the range of applications that can be used. Mix-based countermeasures for Tor are considered sufficient if they succeed in reducing the probability of successful attacks on Tor.

There are abstract mix concepts that can be considered potential countermeasures in this context:

  1. Batch mixes record all incoming packets at a relay node and then flush a defined percentage of packets once an event is triggered, e.g., expiration of a delay duration or receipt of a predefined number of packets.
  2. Continuous-time mixes randomly assign individual delays to packets within a relay node. Unlike batch mixes, this permits the continuous emission of packets while still disrupting the relationship between incoming and outgoing traffic packets.
  3. Dummy traffic injection adds extra packets to perturb the network traffic stream. Such injected packets carry no meaningful payload, and they can disrupt traffic patterns without relying on further delays.

All of the aforementioned mixing concepts can be customized via individual parameters such as the injection rate of dummy packets, delay durations, or flush rates.
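As an illustration of the first two mix concepts and their parameters, the sketch below simulates a threshold-triggered batch mix and a continuous-time mix with exponentially distributed delays. The function names, the packet format, and the parameter values (threshold, mean delay) are hypothetical choices for this example:

```python
import random

def threshold_batch_mix(arrivals, threshold, seed=0):
    """Collect packets until `threshold` have arrived, then flush the
    whole batch in random order (a threshold-triggered batch mix)."""
    rng = random.Random(seed)
    out, batch = [], []
    for _, pkt in arrivals:
        batch.append(pkt)
        if len(batch) == threshold:
            rng.shuffle(batch)
            out.extend(batch)
            batch = []
    rng.shuffle(batch)   # flush any remainder at shutdown
    out.extend(batch)
    return out

def continuous_time_mix(arrivals, mean_delay, seed=0):
    """Assign each packet an independent exponential delay, then emit
    packets in departure order (a continuous-time mix)."""
    rng = random.Random(seed)
    departures = [(t + rng.expovariate(1.0 / mean_delay), pkt)
                  for t, pkt in arrivals]
    return [pkt for _, pkt in sorted(departures)]

# Hypothetical arrivals: (arrival time in seconds, packet id)
arrivals = [(i * 0.1, f"pkt{i}") for i in range(10)]
print(threshold_batch_mix(arrivals, threshold=4))
print(continuous_time_mix(arrivals, mean_delay=0.5))
```

In both cases every packet still leaves the relay, but the emission order no longer matches the arrival order, which is exactly what degrades the correlation metrics an adversary relies on.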

A recently published paper experimented with these mixing strategies to reduce the success probability of internet traffic analysis attacks. The results were quite interesting: mixing strategies, whether applied individually or in combination, can greatly reduce the probability of success not only of traffic analysis attacks in general, but also of confirmation attacks.
