With about 422 million people suffering from diabetes worldwide, it ranks 7th among the leading causes of death. Regular monitoring of blood glucose plays a vital role in the control of diabetes. The methods widely used for measuring blood glucose involve pricking a finger and collecting a blood sample, which is painful, invasive, and increases the risk of spreading disease.
After a thorough literature survey of the available non-invasive blood glucose measurement techniques, a method was devised to extract blood glucose data accurately using Near-Infrared (NIR) Spectroscopy. This involves acquiring the signal, amplifying and filtering it, and then processing the signal and storing the relevant data. A self-monitoring blood glucose kit was used as the reference. For calibration, data from previous iterations of this project was used, and a necessary calibration factor was obtained from it.
Introduction

Human beings derive energy through the internal oxidation of glucose. Complex sugars such as carbohydrates in the human body are decomposed into glucose, a monosaccharide. Insulin, produced by the pancreas, helps in the absorption of glucose and therefore in the regulation of blood glucose levels in the body. Fluctuations of glucose levels in the human body can be dangerous. Hence, maintaining blood glucose at optimum levels (88 mg/dl - 125 mg/dl) is required; when these levels are not kept in check, diabetes sets in. Thus, blood glucose monitoring and measurement play a particularly important role in diabetic therapies.
The most commonly used method for tracking blood glucose levels is the 'Self Blood Glucose Monitoring Kit'. Apart from being costly to use, this device typically requires the patient to prick their finger for testing, which is reported to be a painful experience, deterring them from monitoring their blood glucose levels thoroughly. This led to the development of non-invasive techniques such as NIR spectroscopy to detect glucose levels in diabetic patients. This project uses NIR spectroscopy to perform fingertip photoplethysmography (PPG).
Plethysmography is a technique for measuring the blood volume changes in any part of the body that result from the pulsation of the heart. These measurements are useful in diagnosing arterial obstructions and in pulse wave velocity measurements. PPG technology has been used in a wide range of commercially available devices, such as pulse oximeters, which measure oxygen saturation, blood pressure, and cardiac pulse for assessing other cardiac conditions.
Working Principle

When light of any wavelength is incident on a molecule, a part of it gets absorbed and the rest is scattered/reflected. The extent to which light is absorbed or scattered is determined by the wavelength of the incident light and the size of the molecule. Each molecule is known to absorb light at specific wavelengths. The spectrum of electromagnetic radiation that shows dark lines or bands due to absorption at specific wavelengths is known as the absorption spectrum. IR spectroscopy is one of the many spectroscopy methods widely used to determine molecular properties. IR light, belonging to the low-energy region of the electromagnetic spectrum, only causes the chemical bonds of a molecule to vibrate in different ways. When the vibrational frequency of the molecule matches the frequency of the incident radiation, absorption of the radiation takes place. IR absorbance spectroscopy obeys the Beer-Lambert law, which states that "the quantity of light absorbed by a substance dissolved in a fully transmitting solvent is directly proportional to the concentration of the substance and the path length of the light through the solution."
Hence, as the concentration of the absorbing molecules in the medium through which the IR radiation passes increases, the absorption of the radiation increases and the intensity of the transmitted radiation decreases. The output of a photodiode depends directly on the intensity of the incident radiation.
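In its standard form, the Beer-Lambert law reads A = log10(I0 / I) = ε·c·l, where I0 is the intensity of the incident radiation, I the intensity of the transmitted radiation, ε the molar absorptivity of the substance, c its concentration, and l the path length of the light through the solution. This is why a higher glucose concentration in the light path shows up as a lower photodiode output.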
Block Diagram

The output from the analog sensor (TCRT1000) is first passed through a passive high-pass filter. The signal is amplified and converted from a current signal to a voltage signal at the trans-impedance amplifier. Further noise elimination takes place at a 2nd-order Butterworth low-pass filter; the Butterworth response is chosen for its ripple-free passband and stopband, giving a cleaner and more accurate output signal. The filtered analog signal is passed through an ADC to convert it into a digital signal, sampled at a frequency of 8-12 Hz. The sampled data is further filtered using a digital high-pass filter to remove low-frequency (DC) noise. An FFT is then performed on this data, and the analysis is carried out in the frequency domain.
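As a rough sketch of this digital stage (the sample rate and cut-off below are illustrative assumptions, not the exact values used on the board):

```python
# Sketch of the digital processing stage: high-pass filtering to remove
# DC drift, followed by an FFT. FS and CUTOFF are illustrative values.
import numpy as np
from scipy import signal

FS = 10.0      # assumed sampling frequency (Hz), within the stated 8-12 Hz
CUTOFF = 0.5   # assumed high-pass cut-off (Hz)

def process(samples):
    # 2nd-order digital Butterworth high-pass filter
    b, a = signal.butter(2, CUTOFF, btype="highpass", fs=FS)
    filtered = signal.filtfilt(b, a, samples)  # zero-phase filtering
    # magnitude spectrum of the filtered signal
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / FS)
    return freqs, spectrum

# quick self-test with a synthetic PPG-like signal (~1.2 Hz pulse on a DC offset)
t = np.arange(128) / FS
freqs, spectrum = process(0.8 + 0.05 * np.sin(2 * np.pi * 1.2 * t))
print(freqs[np.argmax(spectrum[1:]) + 1])  # dominant frequency, ~1.2 Hz
```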
The sensor system consists of the TCRT1000 reflective optical sensor (an IR emitter paired with a phototransistor) along with its connecting circuitry. The signal conditioning system consists of 3 blocks – the high-pass filter, the trans-impedance amplifier, and the 2nd-order low-pass Butterworth filter. An MCP6004 op-amp is used for these amplifier and filter stages.
The passive high-pass filter is a circuit used to attenuate signals at frequencies lower than a cut-off frequency. The DC component in the photodiode output shifts the signal upward, so the resultant value sits much higher than ground; the high-pass filter removes this offset before amplification.
The trans-impedance amplifier circuit is an operational amplifier circuit with a feedback resistor and capacitor. As the name suggests, it is mainly used to convert a current signal into a voltage signal while simultaneously amplifying it into the required range. This is needed because the current output of the photodiode corresponds to an extremely small voltage.
The low-pass filter attenuates signals with frequencies greater than a cut-off frequency. To get a clean, unattenuated signal, it is important to eliminate high-frequency noise and the 50 Hz power-line interference. A quick sanity check of the RC cut-off frequencies is sketched below.
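The cut-off frequency of a first-order RC stage is f_c = 1/(2πRC). The component values below are illustrative placeholders, not the actual board values (those are in the withheld design files):

```python
# Cut-off frequency of a first-order RC filter: f_c = 1 / (2*pi*R*C).
# The R and C values are placeholders, not the board's actual components.
import math

def rc_cutoff_hz(r_ohms, c_farads):
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# a high-pass stage sized to pass the ~1 Hz pulse component
print(rc_cutoff_hz(330e3, 1e-6))  # ~0.48 Hz
# a low-pass stage sized to reject 50 Hz power-line noise
print(rc_cutoff_hz(10e3, 1e-6))   # ~15.9 Hz
```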
Note: I cannot share the PCB design files and some other data just yet, since my research paper is under review. However, this design is no rocket science! You can find plenty of open-source designs online with ease.
Software Design (AVR IoT WA and AWS)

The signal received from the sensor subsystem contains blood-glucose-specific information, which is analyzed in the frequency domain. First, the analog-to-digital conversion of the signal is carried out using the ADC on board the microcontroller (AVR IoT WA). This data is then pushed to AWS IoT Core, from where an AWS Lambda function unpacks the JSON data and applies the necessary signal processing techniques; the data corresponding to the blood glucose level is acquired and displayed on AWS CloudWatch.
Note: Not having much time or resources at hand, I didn't optimize any code! I used the template 'AVR Sensor Node' code and modified it for my purpose.
For folks who have never worked with either AVR-IoT or AWS, the following part will be helpful! Others can probably skip to the end.
MPLAB X and AVR-IoT WA Board
Programming the AVR-IoT WA Board is genuinely very simple, and the tools offered by the MPLAB X IDE reduce the effort for everyone from beginner to advanced embedded developers. Beginners may expect a lack of example code or documentation for programming this board, but I assure you that is not the case - the whole workflow is quite intuitive, and you'll see how. I won't be explaining material that is already available on the internet, but I will leave the links to it here!
This GitHub repo can really help you a lot, so ensure you follow the 'Map' in the repo thoroughly: https://github.com/microchip-pic-avr-solutions/microchip-iot-developer-guides-for-aws
When you connect the AVR-IoT board for the first time, you'll be required to generate a Wi-Fi config file (which contains your Wi-Fi details) and save it in the 'Curiosity' drive that shows up on your PC. See more here: https://github.com/microchip-pic-avr-solutions/microchip-iot-developer-guides-for-aws/tree/master/access-the-sandbox
Now, you'll have to connect the board to your AWS Account. This is a fairly straightforward process and you can follow it here: https://github.com/microchip-pic-avr-solutions/microchip-iot-developer-guides-for-aws/tree/master/connect-the-board-to-your-aws-account
Pay close attention to Step 2.2 in the above link. While registering the AWS Credentials, choose one of the following regions only:
- US East (Ohio)
- US East (N. Virginia)
- US West (Oregon)
- Asia Pacific (Singapore)
- Asia Pacific (Sydney)
- Asia Pacific (Tokyo)
- Asia Pacific (Seoul)
- EU (Frankfurt)
- EU (Ireland)
- EU (London)
- China (Beijing)
I initially chose Mumbai as the region for this step, thinking I had to pick my own location. My board didn't connect to AWS because of this, and I wasted some time trying to understand what went wrong.
Once you do this, fire up your MPLAB X IDE and create a new project. Ensure that 'MCC' and the 'Data Visualizer' are installed in the IDE. You may need to download the 'AVR-IoT AWS Sensor Node' library as well and add it to MCC.
MCC (MPLAB Code Configurator) is a GUI that generates C code to be inserted into your project. You can graphically see what peripherals are available, configure them to your needs, and simply 'Generate' the code - MCC takes care of all the details. To get a hang of the board, I suggest you use MCC to generate the code for UART and test a 'Hello World' program. Follow this video to implement this: Getting Started with Foundation Drivers: UART - YouTube. In the video, the board used is an ATmega4809, but the same steps apply to our board (ATmega4808), with the exception that you have to choose USART2 for serial communication. Find more about the board here: AVR-IoT_WA_Schematics.pdf (microchip.com)
Once you get the hang of MCC and the Data Visualizer, you can use practically any peripheral or service available in MCC's 'Available Resources' window to build your application.
As mentioned previously, the 'AVR IoT WA Sensor Node' library in MCC will generate all the necessary files for you to build a complete application, from acquiring data to pushing it to the cloud. However, I did something even simpler here: I downloaded an unmodified example project from here: microchip-pic-avr-solutions/avr-iot-aws-sensor-node-mplab (github.com) and added the relevant code to make things 'happen'. The workflow is as follows:
Press the SW0 button -> the interrupt handler calls the 'sendButtonPressToCloud' function -> the function collects 128 data points from the ADC, packages them as JSON, and pushes them to the cloud on the 'buttonPresses' topic -> the AWS Lambda function takes this data and saves it as a list -> this data is processed using the signal processing algorithm -> the result is published to AWS CloudWatch.
Once you load the AVRIoT.X project, you have to make a few changes to configure the device to your needs. Here's what I did for my project:
- Created a method named 'sendButtonPressToCloud()', which holds the relevant code to sample data from the sensor, package it (as a string wrapped in JSON format) and send it to the cloud.
- Created the 'dataPayload' variable to hold this string (JSON-formatted).
- Defined a macro called 'SENSOR_ADC_CHANNEL' to set the relevant ADC channel. It is basically just a number: to access channel 7, set it to '7'; for channel 5, set it to '5'. Quite straightforward.
- Enabled the interrupt on SW0 (Switch 0) and attached the sendButtonPressToCloud function to its handler. This is done in the 'application_init()' function.
You can go ahead and remove the unnecessary functions and configure the whole set-up to suit your needs. I haven't/couldn't do it because of a time crunch! Once again, sincerest apologies. I will eventually add optimized code for the project, but for now, I hope it serves the purpose. Either way, that's all there is to this. Make and build the code, and voila! The AVR-IoT board now samples and sends the data to AWS IoT Core on your button press.
Moving to AWS - this is where the data is processed and reported. I'm particularly happy I was able to build this, because until a month ago, I had no idea what AWS was about and what all it could do (in the IoT domain)! So, once the data streams to the 'buttonPresses' topic, the AWS Lambda function picks it up and processes it. Follow these steps to make this happen:
- Go to 'AWS Lambda' -> Create a function (choose a Python runtime, to match the SciPy layer added later)
- Go to 'IoT Core' -> Select 'Act' -> Select 'Rules' -> Click 'Create'
- Give it a name and an optional description -> under 'Rule query statement', enter details in the following format: SELECT jsonTextNameInDataStream FROM 'topicName' AS preferredName. Read more here: AWS IoT SQL reference - AWS IoT Core (amazon.com). In my case, the statement was: SELECT B FROM 'buttonPresses', as my data was in this format: { "B" : [ x, y, z, a, b, ... ] }
- Under 'Actions', select: Send a message to a Lambda function -> Choose/Create a Lambda function -> Create
- In 'AWS Lambda', you'll see that a trigger from 'AWS IoT Core' has been added. Now, add a new layer -> choose AWS Layers and select the one with NumPy and SciPy (e.g., AWSLambda-Python38-SciPy1). This makes packages such as NumPy and SciPy available in AWS Lambda.
- Now, type your code -> test with relevant data -> Deploy
- You can observe the results in the 'Monitoring' tab. Click on 'View logs on AWS CloudWatch' and find your stream under 'Log groups'.
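For a concrete picture, here is a minimal sketch of what the Lambda handler can look like. The event shape ({"B": [...]}) follows the rule query above; the filtering and peak detection are only illustrative, since the actual blood glucose algorithm is part of the paper under review:

```python
# Minimal sketch of the AWS Lambda handler. The payload {"B": [...]} matches
# the IoT rule query above; the processing below is illustrative only.
import numpy as np
from scipy import signal

FS = 10.0  # assumed sampling frequency (Hz)

def lambda_handler(event, context):
    samples = np.asarray(event["B"], dtype=float)  # the 128 ADC readings
    # remove DC drift with a 2nd-order digital high-pass filter
    b, a = signal.butter(2, 0.5, btype="highpass", fs=FS)
    filtered = signal.filtfilt(b, a, samples)
    # locate pulse peaks in the PPG waveform
    peaks, _ = signal.find_peaks(filtered, distance=int(FS / 2))
    # anything printed here appears in the CloudWatch log stream
    print({"num_samples": len(samples), "num_peaks": len(peaks)})
    return {"statusCode": 200}
```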
That's the end of the Software Design section! If there are any inconsistencies or issues, please let me know and I will correct them.
Results

Collecting data, calibrating, and validating - this was the flow of the process. However, with the pandemic going on, I had limited resources to test my prototype and thus a limited dataset. This severely affected the calibration process, for which I stuck with a simple linear regression (y = mx + c), which in turn gave me poor results. I still have to explore various methods and collect more data to calibrate and validate this system, but it does look quite promising. I cannot post the results just yet as my research paper on this topic is under review, but I will post the link to the paper once it is published!
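For reference, the linear calibration step amounts to a one-line least-squares fit. The arrays below are placeholders for illustration, not my measured dataset:

```python
# Linear calibration (y = m*x + c) via least squares. The arrays below are
# placeholder values for illustration, not the actual measured dataset.
import numpy as np

sensor_feature = np.array([0.42, 0.55, 0.61, 0.73])    # placeholder readings
reference_mgdl = np.array([90.0, 110.0, 118.0, 140.0]) # placeholder reference values

m, c = np.polyfit(sensor_feature, reference_mgdl, deg=1)  # calibration factors

def estimate_glucose(x):
    # apply the calibration to a new sensor-derived feature
    return m * x + c

print(estimate_glucose(0.60))
```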
A lot of people have argued that this method cannot replace traditional blood glucose measurement, and that is probably true given the current state of the art. However, this system surely has huge potential and, of course, an even bigger market, and it will arrive sooner rather than later!
Here are some of the images of the prototype:
The hardware seen above is not packaged well due to the time constraints of the competition, so apologies for that.
If you have questions or if I might have missed something, please do let me know in the comments. For the hardware design, I recommend you check out Easy Pulse Sensor (Version 1.1) Overview (Part 1) | Embedded Lab (embedded-lab.com) - a pulse oximeter and our application have pretty much the same circuitry.
Thanks for reading! Cheers.
Acknowledgment

I really want to thank Microchip and Hackster for sending me the free hardware. I couldn't have picked up all these new skills in such a short time if it weren't for the hardware and the competition.
References

- https://docs.scipy.org/doc/scipy/reference/generated/scipy.signal.find_peaks.html
- https://docs.aws.amazon.com/lambda/latest/dg/python-package.html
- https://aws.amazon.com/blogs/aws/new-for-aws-lambda-use-any-programming-language-and-share-common-components/
- https://os.mbed.com/cookbook/FIR-Filter
- https://docs.scipy.org/doc/scipy/reference/generated/scipy.signal.butter.html