
Blood-inspired random bit generation using microfluidics system – Scientific Reports

Bio-inspired random number generation system

In our setup, a 50-mW green laser with a wavelength of 532 nm (PSU-III-LCD, Changchun New Industries Optoelectronics Tech Co., Ltd., China) directs its beam through a microfluidic chip. Blood flow within the microchannel is brought into focus under a 4X objective lens (Plan N 4X, NA 0.1, Olympus), and the speckle image is recorded with a CMOS camera (Neo 5.5 sCMOS, Andor Technology Ltd., Belfast, UK). The camera operates with an exposure time of 0.8 ms and a high frame rate of 1250 frames per second, capturing speckle images at a resolution of 128 × 512 pixels for subsequent data generation. To ensure a consistent starting point, data were extracted from 1000 frames, forming a total of 7 sets of experimental data. For preprocessing, each data set was divided into 8 partitions of 32 × 256 pixels each, resulting in a total of 56 sets. Through this process, our blood-based RNG system was confirmed to generate true random numbers at a rapid rate of 5.5 MHz. To enhance the image contrast, a linear polarizer oriented at 90 degrees, an aperture with a 5 mm diameter, and a tube lens with a 180 mm focal length are positioned in front of the camera. The schematic of the entire system is presented in Fig. 2.

Blood sample preparation

This study was approved by the Laboratory Animal Resource Center of the Gwangju Institute of Science and Technology (LARC GIST), as detailed on their website [https://larc.gist.ac.kr/]. All procedures were rigorously followed and formally approved under protocol GIST-2019–015, and all methods are reported in accordance with the ARRIVE guidelines for reporting experiments. We collected blood samples from male Sprague Dawley rats aged 12–13 weeks, weighing between 250 and 280 g, by tail vein phlebotomy. For this, we withdrew 1 ml of blood through a 23G needle while the animals were anesthetized with isoflurane. We performed these blood collections on a group of 11 rats, ensuring a rest period of at least two weeks between procedures. The samples were immediately stored in citrate tubes (catalog #363083, 9NC 0.109 M Buffered Trisodium Citrate, BD Vacutainer, USA) for subsequent experiments.

Fabrication of microfluidic device and system operation

To construct the microfluidic channel shown in Fig. S2, we applied soft photolithography to create channels with precise dimensions: 45 mm in length, 45 µm in height, and 1 mm in width. We made a PDMS slab (PDMS Sylgard 184 A/B, Dow Corning, South Korea) by a standard process of mixing, degassing, and curing, and then bonded it to a cover glass via oxygen plasma treatment.

Our microfluidic system, detailed in Fig. 2b, comprised a vacuum generator incorporating a syringe pump in withdrawal mode, a solenoid valve, a 3-way valve, and a 50 ml syringe to account for dead volume. The syringe pump drew the sample at a constant volume of 200 µL in withdrawal mode, with the solenoid valve regulating the flow. The system was intentionally designed with a large dead volume to stabilize pressure fluctuations. At the microchannel’s outlet, we positioned a 1 mm diameter reservoir and observed it with the sCMOS camera. Before beginning the experiments, we allowed the laser a 5-min warm-up period to ensure stability. Monitoring began when the blood sample was introduced at the inlet and flowed into the microchannel, reaching the designated ROI measuring 0.8 mm by 3.3 mm. Once in position, the solenoid valve was opened, and the camera began capturing images for analysis as the blood entered the ROI.

Image processing to generate binary keys with a von Neumann extractor

We have devised an algorithm for processing speckle pattern images obtained from blood flow to generate encryption keys. The algorithm converts the original speckle images into binary sequences through a binarization process, which thresholds pixel values against the mean, categorizing them as ‘0’ or ‘1’. Prior to applying our Two-Pass Tuple-Output von Neumann (2P-TO-VN) algorithm, we enhance the rate of bit generation by scrambling the image. This methodology was assessed both with and without image scrambling (Fig. 6a,b) to determine its efficacy. To mitigate biases and correlations inherent in laser-produced speckle patterns and varying lighting conditions, we employ the following steps in our 2P-TO-VN debiasing algorithm:

  1. Outputs of ‘00’ or ‘11’ are deleted.

  2. If the output is ‘01’ or ‘10’, only the first bit (‘0’ of ‘01’ or ‘1’ of ‘10’) is retained.

  3. In the second pass, the bits discarded in the first pass are regrouped into quads, and the front and rear halves of each quad are compared.

  4. Quads whose front and rear halves differ, such as ‘0011’ or ‘1100’, are preserved.
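The binarization and 2P-TO-VN steps above can be sketched in Python as follows. Note one assumption: the paper only says that quads with differing halves are "preserved," so emitting the quad's first bit (mirroring the first-pass rule) is our interpretation, not a confirmed detail of the authors' implementation.

```python
import numpy as np

def binarize(img):
    """Threshold each pixel against the image mean: >= mean -> 1, else 0."""
    return (img >= img.mean()).astype(np.uint8).ravel()

def two_pass_tuple_vn(bits):
    """Two-Pass Tuple-Output von Neumann (2P-TO-VN) debiasing sketch.

    Pass 1: '01' -> 0, '10' -> 1; '00'/'11' pairs are set aside.
    Pass 2: the set-aside bits are regrouped into quads; quads whose
    front and rear halves differ ('0011'/'1100') emit their first bit
    (an assumption -- the text only says such quads are preserved).
    """
    out, discarded = [], []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)            # keep first bit of '01' or '10'
        else:
            discarded.extend((a, b))  # '00'/'11' go to the second pass
    for i in range(0, len(discarded) - 3, 4):
        quad = discarded[i:i + 4]
        if quad[:2] != quad[2:]:      # front half differs from rear half
            out.append(quad[0])
    return out
```

For example, the stream `0 1 1 0 0 0 1 1` yields `0` and `1` from the first pass, while the discarded `0 0 1 1` forms one quad whose halves differ, recovering one extra bit in the second pass.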

Figure 6

Flowchart for unpredictable random number generation using the speckle images. The speckle pattern produces unpredictable random bits consisting of 0s and 1s. To improve output rate and performance, we incorporated additional randomness using image scrambling and the 2P-TO-VN debiasing algorithm. (a) Binary image generation process using the von Neumann debiasing algorithm. (b) Binary image generation process using image scrambling and the von Neumann debiasing algorithm. The total pixel counts of the 2P-TO-VN image and the 2P-TO-VN with SCI image are 49,980 (60 × 833) and 56,520 (60 × 942) pixels, respectively, showing the effect of scrambling.

While the CVN debiasing method, known as 1-Pass Tuple-Output von Neumann (1P-TO-VN), greatly reduces the bit count due to its stringent compression, 2P-TO-VN offers a balance between bit retention and debiasing. By re-evaluating initially discarded bits, the 2P-TO-VN algorithm generates a more practical volume of data, as demonstrated in Fig. S3, making it a superior alternative to CVN for generating robust and usable encryption keys.
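The image-scrambling step that raises the bit yield in Fig. 6b is not specified in detail in the text; as one plausible minimal sketch, a seeded pseudorandom pixel permutation would look like this (the permutation scheme and seed are our assumptions, not the authors' documented method):

```python
import numpy as np

def scramble(img, seed=0):
    """Hypothetical image-scrambling step: reorder pixel positions with a
    seeded random permutation before debiasing. The permutation preserves
    the pixel value distribution while breaking spatial correlations."""
    rng = np.random.default_rng(seed)
    flat = img.ravel()
    return flat[rng.permutation(flat.size)].reshape(img.shape)
```

Because the permutation only moves pixels, bit uniformity is unchanged, but neighboring-pixel correlations that survive binarization are dispersed, which is consistent with the higher retained-bit count reported for 2P-TO-VN with SCI.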

NIST statistical randomness tests for random bits generated from speckle images

To determine the quality of randomness of our random bits, we conducted a series of statistical tests using the National Institute of Standards and Technology Statistical Test Suite31. The NIST suite comprises 15 distinct tests, each designed to quantitatively measure a different aspect of randomness in binary sequences. These include the frequency, block frequency, runs, longest run of ones (LRO), serial, approximate entropy, and cumulative sums (Cusums) tests. Our evaluation aggregated binary sequences from 56 different random bit sets, ensuring an adequate stream length for seven of the statistical tests.
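As a flavor of what the suite checks, the simplest member, the Frequency (Monobit) test from NIST SP 800-22, can be sketched in a few lines; a sequence passes at the usual significance level when the p-value is at least 0.01:

```python
import math

def monobit_p_value(bits):
    """NIST SP 800-22 Frequency (Monobit) test: map bits to +/-1, sum,
    normalize by sqrt(n), and compute the p-value with the complementary
    error function. p >= 0.01 is the conventional pass criterion."""
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)   # 0 -> -1, 1 -> +1
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))
```

A perfectly balanced stream gives p = 1.0, while a heavily biased stream (e.g., all ones) gives a p-value far below 0.01 and fails.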

Speckle decorrelation time measurement

Laser speckle is the interference of light after multiple scattering from an optically turbid medium such as whole blood32,33. This speckle pattern changes over time as the sample conditions change. The speckle pattern data can be collected for a time t with an interval of τ. To calculate the electric field autocorrelation, we used the electric field autocorrelation function \(g_{1}(\tau)\), defined as

$$g_{1}(\tau)=\int_{0}^{\infty}P(s)\exp\left[\left(-\frac{2\tau}{\tau_{o}}\right)\frac{s}{l^{*}}\right]ds$$

(1)

where P(s) is the path length distribution in the sample, s is the path length, τ is the delay time, l* is the transport mean-free path, and τo is the characteristic decay time of the medium. At zero delay, the autocorrelation is expected to equal 1, meaning there is no variation at that instant34. As the time lag increases, the value should drop toward zero, at which point the signal is no longer correlated with the first image. To confirm bio-inspired true random bit generation, we measured the decorrelation time between consecutive images from the time series of speckle pattern images at the 50% point as the blood moved through the microchannel.
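The 50% decorrelation-time measurement described above can be sketched numerically: correlate each frame of a speckle stack against the first and report the first lag at which the correlation drops below the threshold. The 0.5 threshold and frame-against-first comparison follow the text; the frame interval `dt` is whatever the camera's frame period is (0.8 ms here).

```python
import numpy as np

def decorrelation_time(frames, dt, level=0.5):
    """Estimate speckle decorrelation time from a stack of frames
    (shape [n_frames, H, W]) captured at interval dt.  Returns the first
    lag at which the Pearson correlation with the first frame falls
    below `level` (50% by default), or None if it never does."""
    ref = frames[0].ravel().astype(float)
    ref -= ref.mean()
    for k in range(1, len(frames)):
        cur = frames[k].ravel().astype(float)
        cur -= cur.mean()
        r = (ref @ cur) / (np.linalg.norm(ref) * np.linalg.norm(cur))
        if r < level:
            return k * dt
    return None  # never decorrelated within this stack
```

A short decorrelation time relative to the frame period is what makes consecutive frames statistically independent bit sources.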

Data analysis (bit uniformity, average correlation)

To evaluate the performance of random bits, we examine the digitized keys. Bit uniformity assesses how balanced the distribution of ‘0’ and ‘1’ bits is within the random bits. Specifically, it estimates how uniform the ratio of ‘0’s to ‘1’s is.

$$Bit\;uniformity=\frac{1}{s}\sum_{l=1}^{s}K_{l}$$

(2)

The uniformity of random bit sequences is often assessed by the normalized Hamming weight, i.e., the fraction of ‘1’ bits in a binary string of length ‘s’. For a set of random bits, the desired uniformity is achieved when this value approaches 0.5, reflecting an equal distribution of ‘0’ and ‘1’ bits, as illustrated in Figs. 4a and S4a. Furthermore, we investigate the correlation between images using a correlation matrix. To ensure a fair comparison, random bits obtained through the von Neumann debiasing algorithm were quantified and matched to create uniform bit matrices (e.g., original speckle image: 7.8 M bits, CVN image: 1.02 M bits, CVN with SCI: 1.8 M bits, 2P-TO-VN image: 3.3 M bits, 2P-TO-VN with SCI: 4.8 M bits).
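Equation (2) is simply the mean of the bit values, which equals the normalized Hamming weight:

```python
import numpy as np

def bit_uniformity(bits):
    """Bit uniformity per Eq. (2): the fraction of '1' bits in the key
    (normalized Hamming weight).  An ideal random stream gives ~0.5."""
    bits = np.asarray(bits)
    return bits.sum() / bits.size
```

For instance, the key `0 1 1 0` has uniformity 0.5, the ideal value, while an all-ones key has uniformity 1.0.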

$$r=\frac{\sum_{i=1}^{n}(x_{i}-\overline{x})(y_{i}-\overline{y})}{\sqrt{\sum_{i=1}^{n}(x_{i}-\overline{x})^{2}}\sqrt{\sum_{i=1}^{n}(y_{i}-\overline{y})^{2}}}$$

(3)

We employ the Pearson correlation coefficient to quantify the correlation35. In the formula, x̅ and y̅ represent the means of the x and y values. A Pearson correlation coefficient r close to 0 indicates no significant correlation between the two sets of random bits35, as shown in Figs. 4b,d, S4b, and S5.
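Equation (3) translates directly into code; identical sequences give r = 1, inverted sequences give r = -1, and independent random bits give r near 0:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient per Eq. (3): covariance of the
    mean-centered sequences divided by the product of their norms."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    return (x @ y) / (np.sqrt((x ** 2).sum()) * np.sqrt((y ** 2).sum()))
```

Filling a matrix with pairwise r values over the 56 bit sets yields the correlation matrices of Figs. 4b,d.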